Monday, October 21, 2013

Creating instances for a SharePoint 2013 Windows Azure farm

Unfortunately, Microsoft has decided that you cannot install SharePoint 2013 on a domain controller.  This has the unfortunate side effect of mandating that you have at least 2 virtual machines running in order to create a SharePoint 2013 development environment.  This post will cover the steps needed to create 2 server instances for SharePoint 2013 development.  One instance will be a simple domain controller.  The other instance will be the more traditional server that will contain the products you wish to use for doing your SharePoint development.  This post will NOT cover installing SharePoint or SQL Server or setting up a domain controller.  This post will discuss how to set up the 2 instances in Windows Azure.
This post assumes you already have Windows Azure PowerShell installed locally with the subscription settings and a storage account set up.  If you have not already done that, then you can view my Getting started with Microsoft Windows Azure PowerShell post for instructions on how to accomplish that.  
  

Script file and Network Config XML for your Domain Controller

Here are the contents of my CreateDC.ps1 script, which I used to create a new Windows Azure instance that will become my domain controller. 
 
$moduleLocation = "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
$publishSettingsLocation = "<C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings>"
$subscriptionName = "Visual Studio Premium with MSDN"
$storageAccountName = "<StorageAccountName>"
$affinityGroup = "<AffinityGroupName>"
$myWindowsAdminName = '<local admin account name>'
$myWindowsAdminPassword = '<local admin account password>'
 
#This will need to be unique within this Azure service
$rdpPort = '<the port to use for RDP access to this box>'

$myAzureVNetXML = "C:\Azure\scripts\NetworkConfig.xml"
 
#If you rerun the script, you need to supply a unique value for $dcServiceName
$vmname = 'DC1'
 
#This name will need to be unique for Windows Azure, so you may want to make
#this name fairly unique
$dcServiceName = '<service name>'     

$vnet = 'Sp2013VN'

#Uncomment these two lines if the module and your subscription settings
#have not already been loaded into this console session
#Import-Module $moduleLocation
#Import-AzurePublishSettingsFile $publishSettingsLocation
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccount $storageAccountName
Select-AzureSubscription -SubscriptionName $subscriptionName

#Create the virtual network based on the settings in the XML file at this path.
#Note: the $affinityGroup, $vmname and $vnet values must match those in the NetworkConfig.xml file.
Set-AzureVnetConfig -ConfigurationPath $myAzureVNetXML

# OS image to use.  This value comes from the list of disk images that are
#available.
$image = "a699494373c04fc0bc8f2bb1389d6106__Win2K8R2SP1-Datacenter-201305.01-en.us-127GB.vhd"
 
#VM Configuration
$MyDC = New-AzureVMConfig -name $vmname -InstanceSize 'Small' -ImageName $image |
    Add-AzureProvisioningConfig -Windows -AdminUsername $myWindowsAdminName -Password $myWindowsAdminPassword |
    Set-AzureSubnet -SubnetNames 'Subnet-1' |
    Remove-AzureEndpoint -Name "RDP" |
    Add-AzureEndpoint -LocalPort 3389 -Name 'RDP' -Protocol tcp -PublicPort $rdpPort

New-AzureVM -ServiceName $dcServiceName -AffinityGroup $affinityGroup -VMs $MyDC -VNetName $vnet 
 
 And, here are the contents of the NetworkConfig.xml file mentioned in the script above.
<NetworkConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration">
  <VirtualNetworkConfiguration>
    <Dns>
      <DnsServers>
        <DnsServer name="Sp2013DC" IPAddress="10.0.0.4" />
      </DnsServers>
    </Dns>
    <VirtualNetworkSites>
      <VirtualNetworkSite name="Sp2013VN" AffinityGroup="<AffinityGroupName>">
        <AddressSpace>
          <AddressPrefix>10.0.0.0/8</AddressPrefix>
        </AddressSpace>
        <Subnets>
          <Subnet name="Subnet-1">
            <AddressPrefix>10.0.0.0/11</AddressPrefix>
          </Subnet>
        </Subnets>
        <DnsServersRef>
          <DnsServerRef name="Sp2013DC" />
        </DnsServersRef>
      </VirtualNetworkSite>
    </VirtualNetworkSites>
  </VirtualNetworkConfiguration>
</NetworkConfiguration>
 

Execute the CreateDC PowerShell script

To execute the script file,  ...
 
  1. Open the Windows Azure PowerShell console as Administrator. 
  2. Change directory, CD, to the folder containing the script.  Example: CD c:\MyScripts
  3. Enter the script filename: .\CreateDC.ps1
After several moments, you will have a new Windows Azure service called <service name> that contains 1 virtual machine that will become the domain controller.  You will be able to remote desktop into the domain controller by using <service name>.cloudapp.net:<the port to use for RDP access to this box>
 
You will need to remote into that server and promote it to be a domain controller before adding additional virtual machines to this service.
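Since the OS image above is Windows Server 2008 R2, the promotion is typically done with dcpromo and an unattended answer file rather than the newer AD DS cmdlets.  Here is a minimal sketch; the domain name contoso.local and the answer file name are placeholders I chose for illustration, not values from this environment:

```
; DCPromoAnswer.txt - minimal new-forest answer file (all values are examples)
[DCInstall]
ReplicaOrNewDomain=Domain
NewDomain=Forest
NewDomainDNSName=contoso.local
ForestLevel=4
DomainLevel=4
InstallDNS=Yes
SafeModeAdminPassword=<DSRM password>
RebootOnCompletion=Yes
```

Run it from an elevated command prompt on DC1 with dcpromo /unattend:C:\DCPromoAnswer.txt.  The InstallDNS=Yes entry matters here, because the NetworkConfig.xml above points the virtual network's DNS at this server (10.0.0.4).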
  

Create SharePoint server instance

Here are the contents of my CreateSP.ps1 script, which I used to create a new Windows Azure instance that will become my SharePoint development server.  This virtual machine will be added to the virtual network that I created in the previous steps. 
 
$moduleLocation = "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
$publishSettingsLocation = "<C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings>"
$subscriptionName = "Visual Studio Premium with MSDN"
$storageAccountName = "<StorageAccountName>"
$affinityGroup = "<AffinityGroupName>"
$myWindowsAdminName = '<local admin account name>'
$myWindowsAdminPassword = '<local admin account password>'
$vmname = 'SP2013'
 
#This will need to be unique within this Azure service
$rdpPort = '<the port to use for RDP access to this box>'
#use the same service name and virtual network name that you used to create your domain controller
$serviceName = '<service name>'
$vnet = '<virtual network>'


Import-Module $moduleLocation
Import-AzurePublishSettingsFile $publishSettingsLocation
Set-AzureSubscription -SubscriptionName $subscriptionName -CurrentStorageAccount $storageAccountName
Select-AzureSubscription -SubscriptionName $subscriptionName
 
 
# OS Image to Use
$image = "a699494373c04fc0bc8f2bb1389d6106__Windows-Server-2012-Datacenter-201306.01-en.us-127GB.vhd"

#VM Configuration
$MySP = New-AzureVMConfig -name $vmname -InstanceSize 'Medium' -ImageName $image |
         Add-AzureProvisioningConfig -Windows -AdminUsername $myWindowsAdminName -Password $myWindowsAdminPassword |
         Set-AzureSubnet -SubnetNames 'Subnet-1' |
         Remove-AzureEndpoint -Name "RDP" |
         Add-AzureEndpoint -LocalPort 3389 -Name 'RDP' -Protocol tcp -PublicPort $rdpPort

New-AzureVM -ServiceName $serviceName -VMs $MySP -VNetName $vnet
 

Execute the CreateSP PowerShell script

To execute the script file,  ...
 
  1. Open the Windows Azure PowerShell console as Administrator. 
  2. Change directory, CD, to the folder containing the script.  Example: CD c:\MyScripts
  3. Enter the script filename: .\CreateSP.ps1
After several moments, you will have a new virtual machine created within the Windows Azure service called <service name>.  This server becomes your SharePoint server.  You will be able to remote desktop into the virtual machine by using <service name>.cloudapp.net:<the port to use for RDP access to this box>.
 
 

Uploading a Hyper-V VHD into Windows Azure and creating a Virtual Machine

I have several legacy VHDs that are taking up space on my laptop.  I decided to move one of them up to Azure to test how this would work and to weigh the potential benefits and drawbacks.  I selected a VHD that is a Windows Server 2003 machine running SharePoint 2007 (WSS 3.0) and SQL Server 2005.  This is a self-contained, configured environment.  I put together the following PowerShell script to load the VHD into Windows Azure.

This post assumes you already have Windows Azure PowerShell installed locally with the subscription settings and a storage account setup.  If you do not have that already done, then you can view my Getting started with Microsoft Windows Azure PowerShell post for instructions on how to accomplish that.

Load VHD into Azure Storage Account

Here are the contents of my UploadVHDFile.ps1 script, which I used to upload a Hyper-V VHD into Windows Azure.

$moduleLocation = "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
$publishSettingsLocation = "<C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings>"
$subscriptionName = "<Visual Studio Premium with MSDN>"
$storageAccountName = "<StorageAccountName>"
$diskName = "<disk name>"
 
# Source VHD
$vhdsource = '<the complete path of the VHD file example: c:\vhds\myimage.vhd>'
 
# Upload Location
$vhddestination= 'http://' + $storageAccountName + '.blob.core.windows.net/vhds/' + $diskName + '.vhd'

Import-Module $moduleLocation
Import-AzurePublishSettingsFile $publishSettingsLocation
    
Add-AzureVhd -LocalFilePath $vhdsource -Destination $vhddestination
    
Add-AzureDisk -OS Windows -MediaLocation $vhddestination -DiskName $diskName


To execute the script file,  ...

  1. Open the Windows Azure PowerShell console as Administrator. 
  2. Change directory, CD, to the folder containing the script.  Example: CD c:\MyScripts
  3. Enter the script filename: .\UploadVHDFile.ps1



After several minutes / hours, the new disk will be loaded into your Azure storage account and registered as an Azure disk.
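If you want to confirm the upload succeeded before building a virtual machine from it, the service management cmdlets used in this post can list the registered disks.  A quick sketch (the disk name placeholder matches $diskName from the script above):

```
#List the registered Azure disks and filter for the one we just added
Get-AzureDisk | Where-Object { $_.DiskName -eq '<disk name>' }
```

If the disk shows up with an OS of Windows and no AttachedTo value, it is ready to be used to create a virtual machine.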



 

Create Virtual Machine from Disk


Here are the contents of my CreateVirtualMachine.ps1 script file which I used to create a Virtual Machine from the VHD that I loaded into Windows Azure.

 
$moduleLocation = "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1"
$publishSettingsLocation = "<C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings>"
$subscriptionName = "<Visual Studio Premium with MSDN>"
$storageAccountName = "<StorageAccountName>"
$diskName = "<disk name>"
$location = 'East US'
 
# Has to be a unique name. Verify with Test-AzureName -Service
$serviceName = "<UNIQUE SERVICE NAME>"

# Server Name
$vmname1 = "<MY VM NAME>"

    
$migratedVM = New-AzureVMConfig -Name $vmname1 -DiskName $diskName -InstanceSize 'Medium' |
     Add-AzureEndpoint -Name 'Remote Desktop' -LocalPort 3389 -Protocol tcp

New-AzureVM -ServiceName $serviceName -Location $location -VMs $migratedVM

 
To execute the script file,  ...

  1. Open the Windows Azure PowerShell console as Administrator. 
  2. Change directory, CD, to the folder containing the script.  Example: CD c:\MyScripts
  3. Enter the script filename: .\CreateVirtualMachine.ps1

 
 


After a couple of minutes, the new virtual machine will be available for you to use in Windows Azure.

 

 

Benefits and Drawbacks

The huge benefit is that you are able to make a virtual machine more accessible for your use.  You can add more resources by changing the instance size.  You can free up space on your local hard drive.  One additional plus is that you are able to load images into Windows Azure that are no longer natively supported through the website, such as Windows Server 2003.  The downside is that you now incur a cost.  There is a fee just for storing an image out in Windows Azure.  Plus, you have to pay for the time the virtual machine is allocated.  And to be clear, an instance that is shut down through Windows is still allocated.  You must use the Windows Azure website or PowerShell console to stop the instance so that you don't get charged.
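With the service management cmdlets used throughout this post, deallocating and restarting look like the following sketch (the service and VM names are the placeholders from the earlier scripts):

```
#Stop AND deallocate the instance so that compute charges stop.
#Recent Azure PowerShell builds deallocate by default on Stop-AzureVM;
#-StayProvisioned would keep the VM allocated (and billed).
Stop-AzureVM -ServiceName '<service name>' -Name 'SP2013' -Force

#Start the instance again when you need it
Start-AzureVM -ServiceName '<service name>' -Name 'SP2013'
```

Note that -Force is needed when stopping the last VM in a deployment, because deallocating it releases the deployment's public IP address.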

Getting started with Microsoft Windows Azure PowerShell

The http://manage.windowsazure.com/ site is the web based interface that allows a user to manage an Azure environment.  There is also a PowerShell interface that you can use to manage your Azure environment and that is what this article will discuss.  This page, Windows Azure PowerShell, is a great starting resource.  There are a few steps that we need to go through in order to get everything ready for you to start working with Azure Virtual Machines. 


The How to install and configure Windows Azure PowerShell page provides the link that you will need to download Windows Azure PowerShell and connect to your subscription.  I have pulled those 2 sections into this post.

Install Windows Azure PowerShell on your computer

You can download and install the Windows Azure PowerShell module by running the Microsoft Web Platform Installer. When prompted, click Run. The Microsoft Web Platform Installer loads, with the Windows Azure PowerShell module available for installation. The Web Platform Installer installs all dependencies for the Windows Azure PowerShell cmdlets. Follow the prompts to complete the installation. 

On Windows 8, you should see a Windows Azure PowerShell icon on your start screen.  If you are using the start button, you should have a Windows Azure group and within that group you should see Windows Azure PowerShell.  You should run as administrator when you start the Windows Azure PowerShell console through either one of those methods.

Connect to your subscription

Use of Windows Azure requires a subscription. If you don't have a subscription, see Get Started with Windows Azure.

The cmdlets require your subscription information so that they can be used to manage your services. This information is provided by downloading it and then importing it for use by the cmdlets. The Windows Azure PowerShell module includes two cmdlets that help you perform these tasks:

  • The Get-AzurePublishSettingsFile cmdlet opens a web page on the Windows Azure Management Portal from which you can download the subscription information. The information is contained in a .publishsettings file.
  • The Import-AzurePublishSettingsFile cmdlet imports the .publishsettings file for use by the module. This file includes a management certificate that has security credentials.

Important
We recommend that you delete the publishing profile that you downloaded using Get-AzurePublishSettingsFile after you import those settings. Because the management certificate includes security credentials, it should not be accessed by unauthorized users. If you need information about your subscriptions, you can get it from the Windows Azure Management Portal or the Microsoft Online Services Customer Portal.

  1. Sign in to the Windows Azure Management Portal using the credentials for your Windows Azure account.
  2. Open the Windows Azure PowerShell console, as instructed in How to: Install Windows Azure PowerShell.
  3. Type the following command:
    Get-AzurePublishSettingsFile
  4. When prompted, download and save the publishing profile and note the path and name of the .publishsettings file. This information is required when you run the Import-AzurePublishSettingsFile cmdlet to import the settings. The default location and file name format is:
    C:\Users\<UserProfile>\Desktop\[MySubscription-…]-downloadDate-credentials.publishsettings
  5. Type a command similar to the following, substituting your Windows account name and the path and file name for the placeholders:
    Import-AzurePublishSettingsFile C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings
  6. To view the subscription information, type:
    Get-AzureSubscription

Next: Create an Azure Affinity Group

According to Create an Affinity Group in the Management Portal, an Azure Affinity Group allows you to group your Windows Azure services to optimize performance. All services within an affinity group will be located in the same data center. An affinity group is required in order to create a virtual network.

  1. Open the Windows Azure PowerShell console as Administrator.
  2. Import your subscription information using this command:
    Import-AzurePublishSettingsFile "C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings"

  3. Type the following command:
    New-AzureAffinityGroup -Name <affinitygroup> -Location "East US" -Label "East US" -Description "Affinity group for production applications in East US."

Finally: Create a Storage Account

According to How To Create a Storage Account, an Azure Storage Account is how you store files and data in the Blob, Table, and Queue services in Windows Azure.  You assign your storage account to a geographic region where you want to store the data.  A storage account can contain up to 100 TB of blob, table, and queue data. You can create up to five storage accounts for each Windows Azure subscription.

You can set up a storage account using the New-AzureStorageAccount cmdlet.  The following steps provide an example of how to do this.  The steps assume you are opening a new Windows Azure PowerShell console.  You can skip steps 1 and 2 if you already have a console window open with your subscription imported.
  1. Open the Windows Azure PowerShell console as Administrator.
  2. Import your subscription information using this command:
    Import-AzurePublishSettingsFile "C:\Users\<UserProfile>\Downloads\<SubscriptionName>-credentials.publishsettings"

  3. Type the following command (the affinity group determines the location, so -Location is not needed here):
    New-AzureStorageAccount -StorageAccountName <StorageAccountName> -AffinityGroup <affinitygroup>

Saturday, December 29, 2012

Downgrading a SharePoint 2010 SQL Server

On one occasion, I came across an instance of a 2 server SharePoint farm.  One server was a SQL Server 2008 R2 database server and the other was the SharePoint 2010 Standard web front end.  The farm was initially set up using the Enterprise edition of SQL Server.  There is nothing wrong with using SQL Server 2008 Enterprise, and SharePoint does take advantage of some of the Enterprise features.  The request came to add SQL Server Reporting Services (SSRS) into the farm.  There are 3 different flavors of SharePoint to SSRS integration with this farm configuration.

1.) Install SSRS in native mode on the database server.  You then use the native mode SSRS web parts to render the reports within SharePoint.

2.) Install SSRS in integrated mode on the database server. The downside to this is that you need to perform a minimal SharePoint web front end installation on the database server. 

3.) Install SSRS in integrated mode on the web front-end server.  There are two things to be mindful of with this approach.  First, you will need a separate SQL license.  Second, Microsoft states that you must use the SAME VERSION of the SQL Server on the web front end as you have on the database server.

The company wanted to use the SSRS in integrated mode AND they did not want to perform a minimal SharePoint installation on the database server.  This only left option 3.  For the farm in question, they had already setup the farm using SQL Server 2008 R2 Enterprise.  This meant they would need to install SQL Server 2008 R2 Enterprise on the web front end server.  There is quite a difference in price between SQL Server 2008 R2 Standard and SQL Server 2008 R2 Enterprise.  Therefore, we decided to downgrade the existing database server.


There are plenty of posts that cover moving a farm from one SQL Server to another.  Here is a short list of steps that I found at http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=158996 .  These are pure SQL Server steps and do not account for setting up the SharePoint configuration databases.   In a nutshell, the SQL steps are:
  1.) Create a script of the custom server roles before downgrading the server
  2.) Detach the user databases
  3.) Generate a database script to create the users
  4.) Back up the physical files of the system databases MASTER, MODEL, and MSDB
  5.) Uninstall SQL Server 2008 Enterprise
  6.) Install SQL Server 2008 Standard
  7.) Attach the user databases
  8.) Run the script to recreate the custom server roles (make sure the Farm Admin account has securityadmin and dbcreator rights)

After installing SQL Server 2008 R2 STANDARD, we started re-attaching all of the SharePoint databases.  We encountered this error "Database 'WebAnalyticsReportingDB' already exists." as we tried to re-attach the database.  In typical Microsoft fashion, this error does not reflect the actual problem.  In order to see the REAL error, you will need to open up SQL Server Management Studio (SSMS) and review the SQL Server logs.  You can see the SQL Server logs by
  1.) Open SQL Server Management Studio
  2.) In Object Explorer, expand the SQL instance
  3.) Expand Management
  4.) Expand SQL Server Logs



When you review the log contents, you will see the following message.


The message "Database 'WebAnalyticsReportingDB' cannot be started in this edition of SQL Server because it contains a partition function 'DataIdPF'." makes sense because SQL Server 2008 R2 Enterprise supports partitioning and SQL Server 2008 R2 standard does not.
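Before attempting a downgrade, you can check each database for Enterprise-only features with a DMV that SQL Server 2008 provides for exactly this purpose.  A sketch, run in the context of the database you plan to move:

```sql
USE WebAnalyticsReportingDB;

--Lists edition-specific features (such as partitioning or compression)
--that would block this database from attaching to a Standard instance
SELECT feature_name
FROM sys.dm_db_persisted_sku_features;
```

On the farm described here, this query should flag Partitioning for the Web Analytics databases before the uninstall/reinstall is ever started.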

To resolve this problem
  1.) Make sure that both WebAnalyticsReportingDB and WebAnalyticsStagingDB are not on the SQL instance.
  2.) Re-provision the service in SharePoint using a PowerShell script from this TechNet site (http://technet.microsoft.com/en-us/library/gg266382.aspx)


I have also run into a similar issue with the Secure Store service.  If this happens to you, then you will need to recreate the Secure Store service in Central Administration.

Sunday, February 26, 2012

Returning a value from a dynamic SQL Statement

There have been times when I have needed to create a dynamic SQL statement within a stored procedure.  You need to consider the permissions of the process that is executing the dynamic SQL statement, but that will not be addressed here.  I had a need on a recent project to execute a dynamic SQL statement from a stored procedure and use its result within that same procedure.  Here is a simple example of how you can do this.

This stored procedure accepts two integer parameters.  It converts them into string values as it builds the SQL statement.  The key is that the data we desire to use is inserted into a temporary table within the dynamic SQL statement.  This temporary table is created within the stored procedure and is accessible by the dynamic SQL statement.  This allows us to query the data from the temporary table and use it later on within the stored procedure.


CREATE PROCEDURE dbo.Test
  @a int
  , @b int
AS
BEGIN

 DECLARE @return_status int
 DECLARE @sql varchar(8000)
   
 SET @sql = 'INSERT INTO #Temp(result_value)
             SELECT ' + convert(varchar(20), @a) + ' + ' + convert(varchar(20), @b);
   
 CREATE TABLE #Temp (
      result_value int null
 )


 EXEC (@sql);

 SELECT *
 FROM #Temp

 DROP TABLE #temp

END



This is the execution and results.

exec dbo.Test 1, 2

result_value
3

(1 row(s) affected)
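For a single scalar result like this, the temporary table can also be avoided entirely: sp_executesql supports typed parameters, including an OUTPUT parameter that flows the value back to the calling procedure.  Here is a sketch of the same example using that approach (dbo.Test2 is just an illustrative name):

```sql
CREATE PROCEDURE dbo.Test2
  @a int
  , @b int
AS
BEGIN

 DECLARE @result int
 DECLARE @sql nvarchar(4000)

 --The dynamic statement assigns into @out, which sp_executesql
 --binds back to our local @result variable
 SET @sql = N'SELECT @out = @p1 + @p2';

 EXEC sp_executesql @sql
    , N'@p1 int, @p2 int, @out int OUTPUT'
    , @p1 = @a, @p2 = @b, @out = @result OUTPUT;

 SELECT @result AS result_value

END
```

sp_executesql also keeps the values as real parameters instead of concatenated strings, which sidesteps the quoting and SQL injection issues that come with building the statement by hand.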

SharePoint Online Development experiences

These are some things that I have learned while doing SharePoint Online development that are not explicitly stated on the SharePoint Online developer site (http://msdn.microsoft.com/en-us/library/hh147180.aspx). Microsoft's basic philosophy is that you can deploy sandbox solutions. Well … that is not completely true. Here are some things you can do in sandbox solutions that you cannot do in SharePoint Online.

1.) You do not have access to some of the most useful objects in the Microsoft.SharePoint.Publishing assembly.  This makes it difficult to make adjustments to the Global Navigation in code. For example, if you try to reference the PublishingWeb object, you get an error when you attempt to load the solution into the solution gallery.

2.) You do not have access to the SPLimitedWebPartManager. This rules out adding or modifying web parts in code. The funny thing is their site says that you can deploy web parts. However, you are not allowed to add your deployed web part to a page via code.

System.TypeLoadException: Could not load type 'Microsoft.SharePoint.WebPartPages.SPLimitedWebPartManager' from assembly 'Microsoft.SharePoint, Version=14.900.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c'


3.) Calendar Overlays are not functional even though they appear to be available through the SharePoint Online UI. If you try to add the overlay via code, you get this message.

System.MissingMethodException: Method not found: 'Microsoft.SharePoint.SPForm Microsoft.SharePoint.SPFormCollection.get_Item(Microsoft.SharePoint.PAGETYPE)'.

This is a documented bug. It’s not specific to SharePoint Online.  The Calendar Overlay functionality only works in the Default Zone.


4.) At one point, I added a simple web part to my Visual Studio 2010 SharePoint project.  The web part only printed the text "Hello World" into a label. I was able to deploy the solution into the Solution Gallery. However, I received an error when I tried to insert the web part into a page.

I got around this issue by separating the control out into a different project.

Tuesday, September 6, 2011

Adding jQuery to NewForm and EditForms

There are times when you need to add more functionality to your SharePoint NewForms and EditForms.  For example, the out-of-the-box SharePoint NewForm / EditForm does not provide the ability to add an input mask to a site column.  jQuery will allow you to do this as shown here.


You can accomplish this goal using either SharePoint Designer or Visual Studio.  I prefer to create solutions that are easily redeployable between development, test, and production; therefore, I am going to use Visual Studio to accomplish this.  To accomplish our goal, we will create a web scoped feature receiver that adds a Content Editor Web Part containing our jQuery commands to a list's NewForm and EditForm. 

I am not going to discuss the setup of the Visual Studio project file.  Instead, I will focus on the code required for the solution.  This is a boilerplate feature activated event handler that allows any user to activate the web scoped feature.  The real work is done within the AddContentEditorWebPart method.

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
  SPWeb web = properties.Feature.Parent as SPWeb;
  SPSecurity.RunWithElevatedPrivileges(() =>    
  {
    using (SPSite elevatedSite = new SPSite(web.Site.ID))
    using (SPWeb elevatedWeb = elevatedSite.OpenWeb(web.ID))
    {
      AddContentEditorWebPart(elevatedWeb);
    }
  });
}

This method creates a list object, then creates a file object for the new form and the edit form for this list.  Once you have the file, you can create a web part manager object and then add in a content editor web part.  The final part of this solution is the XML representation of the web part definition.  A great way to get the base Content Editor Web Part XML is to add a Content Editor Web Part to a SharePoint web part page and then select to export the definition.

private void AddContentEditorWebPart(SPWeb elevatedWeb)
{
  string listName = "Your list title";
  SPList list = elevatedWeb.Lists[listName];
  string xmlWebPartDef = "This is the XML representing the web part (see example below)";

  //Add the web part to both the NewForm (i == 0) and the EditForm (i == 1)
  for (int i = 0; i < 2; i++)
  {
    SPFile frontPage = (i == 0)
      ? elevatedWeb.GetFile(list.DefaultNewFormUrl)
      : elevatedWeb.GetFile(list.DefaultEditFormUrl);

    SPLimitedWebPartManager mgr = frontPage.GetLimitedWebPartManager(PersonalizationScope.Shared);

    XmlReader xmlReader = XmlReader.Create(new StringReader(xmlWebPartDef));
    string errorMessage;
    var contentEditorWebPart = mgr.ImportWebPart(xmlReader, out errorMessage);
    mgr.AddWebPart(contentEditorWebPart, "Main", 999);
  }
}

Once you have that XML, you can add in your jQuery call.  For our original example, we are trying to add a mask to the Home Phone field.  I added my jQuery logic to the CONTENT node near the bottom of this example.

<WebPart xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/WebPart/v2">
  <Title>AddMaskedControls</Title>
  <FrameType>None</FrameType>
  <ChromeType>None</ChromeType>
  <Description>Allows authors to enter rich text content.</Description>
  <IsIncluded>true</IsIncluded>
  <ZoneID>Main</ZoneID>
  <PartOrder>99</PartOrder>
  <FrameState>Normal</FrameState>
  <Height />
  <Width />
  <AllowRemove>true</AllowRemove>
  <AllowZoneChange>true</AllowZoneChange>
  <AllowMinimize>true</AllowMinimize>
  <AllowConnect>true</AllowConnect>
  <AllowEdit>true</AllowEdit>
  <AllowHide>true</AllowHide>
  <IsVisible>true</IsVisible>
  <DetailLink />
  <HelpLink />
  <HelpMode>Modeless</HelpMode>
  <Dir>Default</Dir>
  <PartImageSmall />
  <MissingAssembly>Cannot import this Web Part.</MissingAssembly>
  <PartImageLarge>/_layouts/images/mscontl.gif</PartImageLarge>
  <IsIncludedFilter />
  <Assembly>Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly>
  <TypeName>Microsoft.SharePoint.WebPartPages.ContentEditorWebPart</TypeName>
  <ContentLink xmlns="http://schemas.microsoft.com/WebPart/v2/ContentEditor" />
  <Content xmlns="http://schemas.microsoft.com/WebPart/v2/ContentEditor"><![CDATA[
   <script type="text/javascript" src="/_layouts/1033/jquery.maskedinput-1.2.2.js"></script>
   <script type="text/javascript" language="javascript">
     $(document).ready(function() {
                 $("input[title='Home Phone']").mask("999-999-9999");
     });

      </script>

]]></Content>
  <PartStorage xmlns="http://schemas.microsoft.com/WebPart/v2/ContentEditor" />
</WebPart>


There are a couple of notes that should be touched upon regarding the data in the CONTENT node.  The jquery.maskedinput-1.2.2.js file is not part of the SharePoint installation, so I copied that file into the SharePoint hive.  The jQuery reference locates the field by the control's DISPLAY NAME.  The mask will be saved along with the data, so our home phone data will look like 555-123-4567.  This content editor web part XML was pulled from a SharePoint 2010 site.  The same example will work with a SharePoint 2007 site with one change.  The assembly reference should be Microsoft.SharePoint, Version=12.0.0.0 instead.

This is a simple and powerful way of getting jQuery into your SharePoint NewForm / EditForms.