
Friday, April 8, 2011

How to programmatically copy Web Level, List level and List Item Level Security Information for SharePoint Sites

 

[Note: This post is part of my main blog posts “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”.]

Some Background

[See my other blog on Drawbacks of current List Export and Import MOSS 2007 Content Migration APIs]

How to fix this gap?

Well, there are SharePoint APIs you can use to copy all of the missing security information, but the process needs to follow a specific order to work efficiently. Follow the steps below in order:

  • First, create the new site collection, or identify the existing site collection you will be importing into.
  • In either case, before you perform the import, copy over the custom permission levels.
  • Next, perform your usual import with all security included (settings.IncludeSecurity = SPIncludeSecurity.All).
  • The import will bring over all the users, SharePoint groups, and any applicable OOTB permission levels.
  • Now copy all the web-level permission assignments. Because the custom permission levels were already copied, every assignment can be mapped correctly.
  • Next, for each list and library you imported, copy the list-level permission assignments.
  • Finally, for each item in those lists and libraries, copy the item-level permission assignments.

 

Below are the methods for each of the above tasks.

Your call order:

1. PrepareNewSiteCollection(); // Create the new site collection in preparation for the import, or simply establish your target site collection SPSite object.

2. CopyWebRoles(); // Copy the custom web-level permission levels (role definitions).

3. PerformListExportImporting(); // Perform your list export and import.

4. CopyWebRoleAssignments(); // Copy the web-level permission assignments.
5. CopyListRoleAssignments(); // Copy the list-level permission assignments.
6. CopyListItemsRoleAssignments(); // Copy the item-level permission assignments.
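
For reference, here is a minimal orchestration sketch tying the calls together. It assumes sourceSite and destinationSite are already-opened SPSite objects and that PerformListExportImporting() wraps the export/import calls covered in my Export and Import API post; those names, and the lookup of the destination list by title, are illustrative only.

// Minimal orchestration sketch (sourceSite, destinationSite and
// PerformListExportImporting() are hypothetical placeholders).
using (SPWeb sourceWeb = sourceSite.OpenWeb())
using (SPWeb destinationWeb = destinationSite.OpenWeb())
{
    // 1. Copy the custom permission levels before the import.
    CopyWebRoles(sourceWeb, destinationWeb);

    // 2. Run the list export/import with SPIncludeSecurity.All.
    PerformListExportImporting();

    // 3. Copy the web-level permission assignments.
    CopyWebRoleAssignments(sourceWeb, destinationWeb);

    // 4. Copy list-level and item-level permission assignments for each imported list.
    foreach (SPList sourceList in sourceWeb.Lists)
    {
        // Assumes the list was imported into the destination web with the same title.
        SPList destinationList = destinationWeb.Lists[sourceList.Title];
        CopyListRoleAssignments(sourceList, destinationList);
        CopyListItemsRoleAssignments(sourceList, destinationList);
    }
}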

 

Common Methods:

public static void CopyWebRoles(SPWeb sourceWeb, SPWeb destinationWeb)
{
    // First copy the source web role definitions (permission levels) to the destination web
    foreach (SPRoleDefinition roleDef in sourceWeb.RoleDefinitions)
    {
        // Skip the WSS base permission levels
        if (roleDef.Type != SPRoleType.Administrator
            && roleDef.Type != SPRoleType.Contributor
            && roleDef.Type != SPRoleType.Guest
            && roleDef.Type != SPRoleType.Reader
            && roleDef.Type != SPRoleType.WebDesigner)
        {
            // Handle the "permission level already exists" error
            try { destinationWeb.RoleDefinitions.Add(roleDef); }
            catch (SPException) { }
        }
    }
}


public static void CopyWebRoleAssignments(SPWeb sourceWeb, SPWeb destinationWeb)
{
    // Copy role assignments from the source web to the destination web.
    foreach (SPRoleAssignment sourceRoleAsg in sourceWeb.RoleAssignments)
    {
        SPRoleAssignment destinationRoleAsg = null;

        // Get the source member object
        SPPrincipal member = sourceRoleAsg.Member;

        // Check if the member is a user
        try
        {
            SPUser sourceUser = (SPUser)member;
            SPUser destinationUser = destinationWeb.AllUsers[sourceUser.LoginName];
            if (destinationUser != null)
            {
                destinationRoleAsg = new SPRoleAssignment(destinationUser);
            }
        }
        catch
        { }

        if (destinationRoleAsg == null)
        {
            // Check if the member is a group
            try
            {
                SPGroup sourceGroup = (SPGroup)member;
                SPGroup destinationGroup = destinationWeb.SiteGroups[sourceGroup.Name];
                destinationRoleAsg = new SPRoleAssignment(destinationGroup);
            }
            catch
            { }
        }

        // At this point the role assignment should be established for either a user or a group
        if (destinationRoleAsg != null)
        {
            foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
            {
                try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                catch { }
            }

            if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
            {
                // Handle the "permission assignment already exists" error
                try { destinationWeb.RoleAssignments.Add(destinationRoleAsg); }
                catch (ArgumentException) { }
            }
        }
    }

    // Finally update the destination web
    destinationWeb.Update();
}


public static void CopyListRoleAssignments(SPList sourceList, SPList destinationList)
{
    // Only act if the source list has unique permissions
    if (sourceList.HasUniqueRoleAssignments)
    {
        // Break list permission inheritance first
        destinationList.BreakRoleInheritance(true);

        // Remove the current role assignments
        while (destinationList.RoleAssignments.Count > 0)
        {
            destinationList.RoleAssignments.Remove(0);
        }

        // Copy role assignments from the source list to the destination list.
        foreach (SPRoleAssignment sourceRoleAsg in sourceList.RoleAssignments)
        {
            SPRoleAssignment destinationRoleAsg = null;

            // Get the source member object
            SPPrincipal member = sourceRoleAsg.Member;

            // Check if the member is a user
            try
            {
                SPUser sourceUser = (SPUser)member;
                SPUser destinationUser = destinationList.ParentWeb.Users.GetByEmail(sourceUser.Email);
                destinationRoleAsg = new SPRoleAssignment(destinationUser);
            }
            catch
            { }

            if (destinationRoleAsg == null)
            {
                // Check if the member is a group
                try
                {
                    SPGroup sourceGroup = (SPGroup)member;
                    SPGroup destinationGroup = destinationList.ParentWeb.SiteGroups[sourceGroup.Name];
                    destinationRoleAsg = new SPRoleAssignment(destinationGroup);
                }
                catch
                { }
            }

            // At this point the role assignment should be established for either a user or a group
            if (destinationRoleAsg != null)
            {
                foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
                {
                    try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationList.ParentWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                    catch { }
                }

                if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
                {
                    // Handle the "permission assignment already exists" error
                    try { destinationList.RoleAssignments.Add(destinationRoleAsg); }
                    catch (ArgumentException) { }
                }
            }
        }

        // A list update is not required here
        //destinationList.Update();
    }
    // Source list inherits permissions, so there is nothing to copy
}

public static void CopyListItemsRoleAssignments(SPList sourceList, SPList destinationList)
{
    foreach (SPListItem sourceListItem in sourceList.Items)
    {
        CopyListItemRoleAssignments(sourceListItem, destinationList.GetItemById(sourceListItem.ID));
    }
}

public static void CopyListItemRoleAssignments(SPListItem sourceListItem, SPListItem destinationListItem)
{
    // Only act if the source item has unique permissions
    if (sourceListItem.HasUniqueRoleAssignments)
    {
        // Break item permission inheritance first
        destinationListItem.BreakRoleInheritance(true);
        destinationListItem.Update();

        // Remove the current role assignments
        while (destinationListItem.RoleAssignments.Count > 0)
        {
            destinationListItem.RoleAssignments.Remove(0);
        }
        destinationListItem.Update();

        // Copy role assignments from the source item to the destination item.
        foreach (SPRoleAssignment sourceRoleAsg in sourceListItem.RoleAssignments)
        {
            SPRoleAssignment destinationRoleAsg = null;

            // Get the source member object
            SPPrincipal member = sourceRoleAsg.Member;

            // Check if the member is a user
            try
            {
                SPUser sourceUser = (SPUser)member;
                SPUser destinationUser = destinationListItem.ParentList.ParentWeb.AllUsers[sourceUser.LoginName];
                if (destinationUser != null)
                {
                    destinationRoleAsg = new SPRoleAssignment(destinationUser);
                }
            }
            catch
            { }

            // Not a user, so check if the member is a group
            if (destinationRoleAsg == null)
            {
                try
                {
                    SPGroup sourceGroup = (SPGroup)member;
                    SPGroup destinationGroup = destinationListItem.ParentList.ParentWeb.SiteGroups[sourceGroup.Name];
                    if (destinationGroup != null)
                    {
                        destinationRoleAsg = new SPRoleAssignment(destinationGroup);
                    }
                }
                catch
                { }
            }

            // At this point the role assignment should be established for either a user or a group
            if (destinationRoleAsg != null)
            {
                foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
                {
                    try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationListItem.ParentList.ParentWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                    catch { }
                }

                if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
                {
                    // Handle the "permission assignment already exists" error
                    try { destinationListItem.RoleAssignments.Add(destinationRoleAsg); }
                    catch (ArgumentException) { }
                }
            }
        }

        // SystemUpdate so the item's modification metadata is not affected
        destinationListItem.SystemUpdate(false);
    }
    // Source item inherits permissions, so there is nothing to copy
}



 






Drawbacks of current List Export and Import MOSS 2007 Content Migration APIs

 

[Note: This post is part of my main blog posts “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”.]

Some Background.

Suppose you are considering the SharePoint Content Migration API for copying your SharePoint content from one site to another. The tested scenarios below show what you will actually get out of the API for your implementation.

When you use a SharePoint SPWeb object as the source of the Export/Import API process, you get the SharePoint permissions, SharePoint users, and SharePoint groups, as well as the site permissions. In addition, since all the lists and list items fall within the larger scope of the SPWeb, the entire set of list and list item level permissions is copied as well.

Beyond security, all of your metadata and date/time stamps are retained, so you are covered for the most part (excluding any customizations, workflows, alerts, etc., which you are responsible for transitioning appropriately).

However, I have found that security data is not transferred appropriately when exporting/importing anything below the SPWeb object.

What security information does SharePoint List Export/Import API Process copy?

  • Copies only the Users/AD Groups  and the SharePoint Groups and their members.

Source vs. target comparison (screenshots omitted):

  • Users: users present in the source are missing in the target.
  • Permission levels: the custom permission level(s) are missing in the target (Publishing Portal specific permission levels will also be missed if the target site collection is not a publishing site, which is fine).

 

What security information does SharePoint List Export/Import API Process not copy?

  • Does not copy Web Level Permission Assignments.
  • Does not copy List/Lib Level Permission Assignments (Other than the OOTB Groups and OOTB Permission Levels).
  • Does not copy List/Lib Item/Folder Level Permission Assignments (Other than the OOTB Groups and permission levels).

Source vs. target comparison (screenshots omitted):

  • Web level permissions: all web-level permission assignments are missing in the target.
  • Library level permissions: all library-level permission assignments are missing in the target.
  • Library item level permissions: all library item-level permission assignments are missing in the target.

 

How to fix this gap?

See my other blog on How to programmatically copy Web Level, List level and List Item Level Security Information for SharePoint Sites

Thursday, April 7, 2011

SharePoint (MOSS 2007) Export and Import API specifics (PRIME)

 

[Note: This post is part of my main blog posts “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”.]

[Also see my other blog on Drawbacks of current List Export and Import Content Migration APIs]

[Also see my other blog on Copying Web Level, List level and List Item Level Security Information]

Overview

There are several good references (refer to my main blog on SharePoint 2010 Migration) when it comes to the SharePoint Export and Import API. I still wanted to call out, as a reference, all of the options that need to be set appropriately to get your export and import going.

Export Specifics

Below I am listing the relevant export settings properties, followed by my comments.

Basic Object definition properties to be set:

SPExportObject exportObject = new SPExportObject();
exportObject.Id = web.ID;
exportObject.IncludeDescendants = SPIncludeDescendants.All;
exportObject.Type = SPDeploymentObjectType.Web;

 

Export Specific settings:

SPExportSettings settings = new SPExportSettings();

//Standard Options
settings.SiteUrl = web.Url;
settings.FileLocation = exportFolderPath;
string exportImportSubFolder = YourExportFolder;

 

Export log file settings (delete any log left from a previous run first):
string exportLogFile = YourExportLogFilename;
if (System.IO.File.Exists(exportLogFile))
{
    File.Delete(exportLogFile);
}
settings.LogFilePath = exportLogFile;


Export general Settings:
settings.FileCompression = false;
settings.ExcludeDependencies = true;
settings.OverwriteExistingDataFile = true;

Export error-handling settings (do not halt on non-fatal errors or warnings):
settings.HaltOnNonfatalError = false;
settings.HaltOnWarning = false;


Export configurable Settings:
settings.IncludeSecurity = SPIncludeSecurity.All;
settings.IncludeVersions = SPIncludeVersions.All;
settings.CommandLineVerbose = true; 
settings.ExportMethod = SPExportMethodType.ExportAll;


Note this setting if you want to do a mock run of the export:
settings.TestRun = true;  
                    
As a best practice, run Validate(); your try/catch can then catch errors early, before the export is kicked off:
settings.Validate();


Final Export Run Settings:
settings.ExportObjects.Add(exportObject);
SPExport export = new SPExport(settings);
export.Run();

 

Import Specifics

Basic Object definition properties to be set:
SPImportSettings settings = new SPImportSettings();
settings.SiteUrl = YourNewSiteCollection.Url;
//Assign the importing parent web URL
settings.WebUrl = DestinationWebURL;

Set the file location where the exported data is located:
settings.FileLocation = YourImportFolderPath;

Set the import log file name. Delete it first if it exists from the previous run, otherwise it will be appended to and could get large:
if (System.IO.File.Exists(YourImportLogFile))
{
    File.Delete(YourImportLogFile);
}
settings.LogFilePath = YourImportLogFile;

No need to set compressed since we will be importing from an uncompressed export:
settings.FileCompression = false;

Set to false to be able to re-parent into a new site collection:
settings.RetainObjectIdentity = false; 

You want all the security to be included (see my other blog on copying web, list, and list item level security information):
settings.IncludeSecurity = SPIncludeSecurity.All;

You want to import all of the original user and date/time information:
settings.UserInfoDateTime = SPImportUserInfoDateTimeOption.ImportAll;

To improve import performance, do not halt on non-fatal errors or warnings, and suppress the After events:
settings.HaltOnNonfatalError = false;
settings.HaltOnWarning = false;
settings.SuppressAfterEvents = true;

You want to keep all the versions (append rather than overwrite):
settings.UpdateVersions = SPUpdateVersions.Append;

Verbose logging will generate a detailed log while running:
settings.CommandLineVerbose = true;

As a best practice, run Validate(); your try/catch can then catch errors early, before the import is kicked off:
settings.Validate();

Final Import Run Settings:
SPImport import = new SPImport(settings);
import.Run();
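
Putting the pieces together, here is a minimal end-to-end sketch. It assumes web is the source SPWeb and that exportFolderPath, exportLogFile, importLogFile, destinationSiteUrl, and destinationWebUrl are placeholders you supply; the settings simply mirror the snippets above.

// Minimal end-to-end export/import sketch (folder, log, and URL variables are placeholders).
SPExportSettings exportSettings = new SPExportSettings();
exportSettings.SiteUrl = web.Url;
exportSettings.FileLocation = exportFolderPath;
exportSettings.LogFilePath = exportLogFile;
exportSettings.FileCompression = false;
exportSettings.IncludeSecurity = SPIncludeSecurity.All;
exportSettings.IncludeVersions = SPIncludeVersions.All;
exportSettings.ExportMethod = SPExportMethodType.ExportAll;

SPExportObject exportObject = new SPExportObject();
exportObject.Id = web.ID;
exportObject.Type = SPDeploymentObjectType.Web;
exportObject.IncludeDescendants = SPIncludeDescendants.All;
exportSettings.ExportObjects.Add(exportObject);

exportSettings.Validate();
new SPExport(exportSettings).Run();

SPImportSettings importSettings = new SPImportSettings();
importSettings.SiteUrl = destinationSiteUrl;
importSettings.WebUrl = destinationWebUrl;
importSettings.FileLocation = exportFolderPath;
importSettings.LogFilePath = importLogFile;
importSettings.FileCompression = false;
importSettings.RetainObjectIdentity = false;
importSettings.IncludeSecurity = SPIncludeSecurity.All;
importSettings.UserInfoDateTime = SPImportUserInfoDateTimeOption.ImportAll;
importSettings.UpdateVersions = SPUpdateVersions.Append;

importSettings.Validate();
new SPImport(importSettings).Run();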

How to programmatically lock SharePoint Site Collection while Exporting and other maintenance work

 

[Note: This post is part of my main blog posts “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”.]

In order to ensure your export remains consistent from the start of the export through the completion of the import, consider read-locking the site collection you are exporting.

Let's see how the UI lock settings map to the code options:

The UI lock settings map to the SPSite.WriteLocked, SPSite.ReadOnly, and SPSite.ReadLocked properties respectively (screenshots omitted).

Below I am sharing my code for implementing the read-only locks. A few things to note:

  • When you are applying a lock, first assign the LockIssue (the lock reason) and then set the lock property; at that point the site collection enters the respective lock mode.
  • When you are removing a lock, first set the given lock property to false, which releases the site collection from that lock mode, and then clear the lock issue.

Your Call order

1. LockSiteCollection(SiteCollection);
2. Perform Your Export/Import here
3. UnLockSiteCollection(SiteCollection);

Common Methods you can use:

 

private void LockSiteCollection(SPSite site)
{
    // Try to lock the site collection
    try
    {
        site.LockIssue = "Your lock reason";
    }
    // Handle a read-only lock left over from a previous failed run
    catch (UnauthorizedAccessException)
    {
        site.ReadOnly = false;
        site.LockIssue = "Your lock reason";
    }
    site.ReadOnly = true;
}

private void UnLockSiteCollection(SPSite site)
{
    site.ReadOnly = false;
    site.ReadLocked = false;
    site.WriteLocked = false;
    site.LockIssue = "";
}
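
As a usage sketch, wrap the lock/unlock calls around your maintenance work in a try/finally so the lock is always released; siteUrl and PerformExport() below are hypothetical placeholders.

// Minimal usage sketch (siteUrl and PerformExport() are placeholders).
using (SPSite site = new SPSite(siteUrl))
{
    LockSiteCollection(site);
    try
    {
        PerformExport(); // your export/import or other maintenance work
    }
    finally
    {
        // Always release the lock, even if the work above fails.
        UnLockSiteCollection(site);
    }
}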





Site Collection Creation and re-parenting

[Note: This post is part of my main blog posts “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”.]

When you are importing an SPWeb or an SPWeb tree into a new site collection, you will first want to create the new site collection. Moving a given SPWeb so that it becomes the root web of a new site collection is called re-parenting; the references in my main blog cover this in more detail. But I wanted to emphasize and clarify why you should create the new site collection via STSADM createsite as opposed to SPWebApplication.CreateSite().

SPWebApplication.CreateSite().

Result: 

  • Site Collection
  • -->Rootweb is created with given Template Type (STS/MPS/Custom)

What happens here: you can only import on top of the root web, so the end result is that your imported web is always a sub-web of the root web.

 

STSADM  createsite

Result: 

  • Site Collection
  • -->Rootweb is created with Blank Template Type (NOT STS/MPS/Custom)

What happens here: you can import into the root web itself, so the end result is that your imported web can live at the root web level.
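
As a sketch (the URL, owner login, and owner e-mail are placeholders you would substitute), creating the target site collection with STSADM looks like the following; note that no -sitetemplate is specified, which is what leaves the root web blank so the import can land at the root web level.

stsadm -o createsite -url http://server/sites/NewSiteCollection -ownerlogin DOMAIN\username -owneremail owner@example.com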

Tuesday, April 5, 2011

How to find SharePoint List and Libraries true Size?

 

Overview

I wanted to get the size of the lists and libraries in my portal in a reporting fashion. For a Publishing Portal site collection you can see list and library sizes from Site Actions, under the Site Collection Administration group, via “Storage space allocation”. This is great information, but not something I can consume programmatically (screenshot omitted).

If you look closely, this page is served through /_layouts/storman.aspx. Digging further into this page leads to the namespace “Microsoft.SharePoint.ApplicationPages.StorMan”, which is obfuscated, so it is not obvious how the data above is retrieved.

 

Further exploration

So, still hoping to find the little nugget that produces this information, I started digging into the content database stored procedures.

Then I came across two interesting stored procedures: proc_GetListSizes and proc_GetDocLibrarySizes

Both stored procedures take the site collection GUID. When I test-ran them, the results matched the Storage space allocation page above.
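
For a quick check you can test-run them directly against the portal's content database, for example in SQL Server Management Studio. The GUID below is a placeholder for your site collection ID, and the @SiteId parameter name matches the one used in the code later in this post.

-- Run against the portal's content database; substitute your site collection GUID.
EXEC proc_GetDocLibrarySizes @SiteId = '00000000-0000-0000-0000-000000000000'
EXEC proc_GetListSizes @SiteId = '00000000-0000-0000-0000-000000000000'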

 

So what did I do with these stored procedures?

I ran the two stored procedures against the content database of my portal site collection, merged their results, and used some rudimentary caching to retain the result set. Now you can get the size of any given SPList object (in bytes, with a helper to format it as KB/MB/GB/TB).

Warning

This is not recommended against a production system, as querying the content database directly is not a Microsoft-supported operation. Also be aware that the result set could be large and the queries may impact your system.

Well here is all you are waiting for… the code base:

I have tested this against MOSS 2007 and have not had a chance to validate it on any other SharePoint version yet.

  • I have included GetWebSizeWithUnitMeasure() to get the total size of all lists in a web.
  • I have included GetListSizeInBytes() to get the size of a single list.
  • I have also included DefineSizeWithUnitMeasure(), which determines the best unit (bytes, KB, MB, GB, or TB) in which to represent a size.

 

Example calls

To get the size of a web:

string webSizeWithUnitMeasure;
double webSizeInBytes = GetWebSizeWithUnitMeasure(web, out webSizeWithUnitMeasure);
// Use webSizeWithUnitMeasure to print the size with its unit.


To get the size of a list:


string listSizeWithUnitMeasure;
double listSizeInBytes = GetListSizeWithUnit(list, out listSizeWithUnitMeasure);
// Use listSizeWithUnitMeasure to print the size with its unit.



Common Functions




 





public static double DefineSizeWithUnitMeasure(double sizeInBytes, out string unitMeasure)
{
    unitMeasure = "Bytes";
    double size = sizeInBytes;

    if (size > 1024)
    {
        size = sizeInBytes / 1024d; // KB
        unitMeasure = "KB";
    }
    if (size > 1024)
    {
        size = size / 1024d; // MB
        unitMeasure = "MB";
    }
    if (size > 1024)
    {
        size = size / 1024d; // GB
        unitMeasure = "GB";
    }
    if (size > 1024)
    {
        size = size / 1024d; // TB
        unitMeasure = "TB";
    }

    return size;
}

public static double GetWebSizeWithUnitMeasure(SPWeb web, out string withUnitMeasure)
{
    double storageUsage = 0d;

    // Sum the size of every list and library in the web
    foreach (SPList list in web.Lists)
    {
        storageUsage += (double)GetListSizeInBytes(list);
    }

    string unitMeasure = "";
    double webSize = DefineSizeWithUnitMeasure(storageUsage, out unitMeasure);

    withUnitMeasure = string.Format("{0} {1}", webSize.ToString("f"), unitMeasure);

    return storageUsage;
}





public static double GetListSizeWithUnit(SPList list, out string withUnitMeasure)
{
    double listSizeInBytes = (double)GetListSizeInBytes(list);
    string unitMeasure = "";
    double listSize = DefineSizeWithUnitMeasure(listSizeInBytes, out unitMeasure);

    withUnitMeasure = string.Format("{0} {1}", listSize.ToString("f"), unitMeasure);

    return listSizeInBytes;
}



public static long GetListSizeInBytes(SPList list)
{
    long listSize = 0;

    // Filter the cached result set for this list's ID
    string filter = string.Format("tp_id='{0}'", list.ID);

    DataTable myDataTable = GetCachedSiteCollectionListSizes(list.ParentWeb.Site);
    DataRow[] dataRows = myDataTable.Select(filter);

    if (dataRows.Length > 0)
    {
        listSize = (long)dataRows[0]["TotalSize"];
    }

    return listSize;
}

// Rudimentary cache: hold the result set for the most recently queried site collection
private static DataTable m_SiteCollectionListSizes;
private static Guid m_SiteCollectionListSizesSiteID;

private static DataTable GetCachedSiteCollectionListSizes(SPSite site)
{
    if (m_SiteCollectionListSizes == null || m_SiteCollectionListSizesSiteID != site.ID)
    {
        m_SiteCollectionListSizes = GetSiteCollectionListSizes(site);
        m_SiteCollectionListSizesSiteID = site.ID;
    }

    return m_SiteCollectionListSizes;
}

private static DataTable GetSiteCollectionListSizes(SPSite site)
{
    // Combine the document library and list size results
    DataTable dataTable = GetDocLibSizes(site);
    dataTable.Merge(GetListSizes(site));

    return dataTable;
}

private static DataTable GetDocLibSizes(SPSite site)
{
    return GetSizesFromContentDatabase(site, "proc_GetDocLibrarySizes");
}

private static DataTable GetListSizes(SPSite site)
{
    return GetSizesFromContentDatabase(site, "proc_GetListSizes");
}

// Shared helper: runs the given stored procedure against the site collection's
// content database and loads the results into a DataTable.
// Requires: using System.Data; using System.Data.SqlClient;
private static DataTable GetSizesFromContentDatabase(SPSite site, string storedProcName)
{
    string connectionString = site.WebApplication.ContentDatabases[site.ContentDatabase.Id].DatabaseConnectionString;

    SqlConnection connection = null;
    SqlDataReader reader = null;
    DataTable dataTable = null;

    try
    {
        connection = new SqlConnection(connectionString);
        connection.Open();

        SqlCommand command = new SqlCommand(storedProcName, connection);
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add(new SqlParameter("@SiteId", site.ID.ToString()));

        reader = command.ExecuteReader();

        dataTable = new DataTable();
        dataTable.Load(reader);
    }
    finally
    {
        if (reader != null)
            reader.Close();
        if (connection != null)
            connection.Close();
    }

    return dataTable;
}
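
As a usage example, here is a minimal reporting sketch that walks every web in a site collection and prints the sizes; the siteUrl variable and the console output format are assumptions for illustration.

// Minimal reporting sketch (siteUrl is a placeholder for your site collection URL).
using (SPSite site = new SPSite(siteUrl))
{
    foreach (SPWeb web in site.AllWebs)
    {
        try
        {
            string webSizeWithUnit;
            GetWebSizeWithUnitMeasure(web, out webSizeWithUnit);
            Console.WriteLine("{0} : {1}", web.Url, webSizeWithUnit);

            foreach (SPList list in web.Lists)
            {
                string listSizeWithUnit;
                GetListSizeWithUnit(list, out listSizeWithUnit);
                Console.WriteLine("    {0} : {1}", list.Title, listSizeWithUnit);
            }
        }
        finally
        {
            web.Dispose(); // webs returned by SPSite.AllWebs must be disposed explicitly
        }
    }
}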



 






Friday, March 25, 2011

Moving Large MOSS 2007 Sites and Content

 

[Note: This post is part of my main blog post on “SharePoint 2010 Migration”.]

What is Moving Large Sites?

  • If your site collection is over 200 GB, its underlying dedicated content database will also be over 200 GB, exceeding the Microsoft-recommended maximum size for a content database.
  • The content and site segregation effort moves such data out to separate site collections and into a separate content database (or an existing smaller database if you prefer), thereby reducing the overall size of your larger content database.
  • Your site collection could have several lists and libraries with over 1,000, or even over 2,000, items per list or library, and they might be continuing to grow.
  • The site collection may also have several document libraries that are several GB in size, which may or may not fall under the above category of over 1,000 items.
  • You will need to do some analysis to understand where things are growing. I will cover the steps for this in the Analysis section below.
  • Hence your design for moving such content requires a very flexible approach that lets you pick and choose a given list, library, or even an entire site (SPWeb) to be moved to a new or an existing site collection.
  • I call this moving of selective content to separate sites “segregation”.

Design Considerations

  • You want to ensure that the segregated content in the new site collection carries over all content, date and time stamps, metadata, pages, applicable settings, and security information.
  • You want your segregation approach to be able to handle each given scenario: moving a site, or moving a list or library.

Available tools and options

MOSS 2007 has the following options when it comes to moving/copying content:

  • Content Migration APIs (PRIME): more granular support (down to item level), full fidelity.
  • STSADM export/import operations: uses the same API but is less granular (web level only) and lower fidelity.
  • Content Deployment via the Central Administration UI does not apply here, as it is meant for deploying to a target site.

Design Approach

  • Based on the need to support individual items, consider implementing a configuration data file that you feed to your process (see the sketch after this list).
  • The configuration data file should identify a given list, library, or site as a source object to be moved.
  • For each such source object, a target should be specified.

    • The target can be a new or an existing site collection.

    • For an existing site collection, the location within the site collection where the content should be moved.

    • For a new site collection, the name, URL, and site template type will need to be determined.

  • For the entire segregation process a default set of parameters should be defined, such as the minimum number of versions to migrate (for example, the last 10 versions). Only on demand should this parameter be overridable for a given source object.

  • The segregation export and import should preserve the entire set of metadata, version history, file status as-is (including checked-out status, if possible), and the user permission configuration.

The segregation should also support consistent, as-is migration of data from source to target.
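
As a sketch of what such a configuration data file might look like, here is a hypothetical XML shape; every element and attribute name is illustrative only, not part of any SharePoint schema.

<SegregationConfig>
  <!-- Default parameters applied to every source object unless overridden. -->
  <Defaults maxVersions="10" includeSecurity="All" />
  <!-- One entry per list, library, or site (SPWeb) to be moved. -->
  <SourceObject type="Library" url="http://portal/sites/source/LargeLibrary">
    <Target siteCollectionUrl="http://portal/sites/archive1" createNew="true"
            title="Archive 1" template="STS#0" />
    <Overrides maxVersions="10" />
  </SourceObject>
</SegregationConfig>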

Design Constraints and Decisions

Implementation Approach:

 

Site Collection Creation and re-parenting: see my other blog.
Locking the Exporting Site Collection: see my other blog.
Export and Import API call specifics: see my other blog.
Drawbacks of the current List Export and Import Content Migration APIs: see my other blog.
How to programmatically copy Web Level, List level and List Item Level Security Information: see my other blog.

Thursday, March 24, 2011

SharePoint 2010 Migration

 

What am I blogging about?

Back in 2007 I performed the MOSS 2007 migration with great success, and this year I am back to perform a SharePoint 2010 migration covering several web applications, TBs of data, and several customizations. I want to use this post as the main page for capturing the major steps and tidbits of my findings. I will try to capture the details in order so that you can use them as guidance for your own migration.

Disclaimer

What I post here is general information relating to the topic, provided in the spirit of sharing with the community.

Steps in the order

Below are the major steps in my approach; for each I will be including sub-activities and a detailed blog post on the topic.

  1. Why to migrate to SPS 2010?
  2. Set up your development environments.
    • If you have very simple sites with no customizations, or if you already have your development/test environment, then you are covered.
    • Otherwise your first step should be getting development environments set up for both the MOSS 2007 and SPS 2010 platforms.
  3. Simulate your web application(s).
    • If your web applications are running on SSL, see my other blog on how to implement SSL in development.
  4. Perform Analysis
  5. Preparation for Migration
  6. Migration Design (coming soon…)
  7. Migration Implementation (coming soon…)
  8. Migration Testing (coming later…)
  9. Deployment (coming later…)

What the Pre-Upgrade Checker report is and what it is not

 

[This post is part of my main blog post on SharePoint 2010.]

The Pre Upgrade report captures three different levels of information:

  • Farm level information
    • OS Level
    • SharePoint Farm topology
  • Application level configurations
  • Content level State information

The Pre-Upgrade report runs in the context of “what if we were to perform an in-place upgrade on this same MOSS 2007 farm”.

If your MOSS 2007 farm OS is Windows 2003 or 32-bit, you will see that the OS prerequisite check fails. Since you are not going to use the same farm and will not be doing an in-place upgrade, this status is not applicable.
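
For reference, on MOSS 2007 SP2 or later the report is generated from the command line on a farm server; the basic form is:

stsadm -o preupgradecheck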

More to come…

Friday, April 16, 2010

Email links to InfoPath try to open the form in the rich InfoPath client as opposed to the browser-based form.

 

Overview

For a Nintex approval task, you will have email text. In the email you would add a link to respond to the task, such that the link opens the InfoPath form in the browser (screenshot omitted).

Issue

The email link that opens the form tries to open the InfoPath form in the rich client as opposed to the browser.

Diagnosis

I have noticed some flakiness with the link in the task email where the reference to “{Common:ItemUrl}” embeds the link.

In one of our builds, clicking the link tried to download the InfoPath file and open it in the rich InfoPath client as opposed to opening it in the browser. Below is the link, and next to it the HTML snippet of the email.

.xml">.xml">https://webmail.MyCompany.com/owa/redir.aspx?C=9255f52eea184a3ab087aadad9d7b0da&URL=http://<FileUrl>.xml

Email HTML

<P style="MARGIN: 0in; FONT-FAMILY: Arial; FONT-SIZE: 10pt">Click&nbsp;&nbsp;</FONT><property style="COLOR: blue; TEXT-DECORATION: underline" title=Hyperlink contentEditable=false refText="here" link="true" refLink="{Common:ItemUrl}">here</property>&nbsp;&nbsp;to review and respond to the task</P>

Below is the Working link:

.xml">.xml">https://webmail.MyCompany.com/owa/redir.aspx?C=9255f52eea184a3ab087aadad9d7b0da&URL=http://<FileUrl>.xml%26OpenIn%3dbrowser

HTML

<P style="MARGIN: 0in; FONT-FAMILY: Arial; FONT-SIZE: 10pt">Click&nbsp;&nbsp;</FONT><property style="COLOR: blue; TEXT-DECORATION: underline" title=Hyperlink contentEditable=false refText="here" link="true" refLink="{Common:ItemUrl}">here</property>&nbsp;&nbsp;to review and respond to the task</P>

As you can see, in the working link the addition of “&OpenIn=Browser” is present at the end of the URL, while the HTML snippet is identical in both cases.

It is a simple setting that causes this issue. If you look at the form library's Advanced Settings, you will see that browser-enabled documents are set to open in the client application (screenshot omitted).

Simply switch this over to “Display as Web Page” (screenshot omitted).

Now any email notifications generated after this setting change will include the “OpenIn=Browser” query parameter in the link URL.

Thursday, April 15, 2010

Implementing workflows using MOSS 2007, Nintex Workflow 2007 and InfoPath Forms 2007

Overview

Lately I have been busy building workflows using Nintex Workflow 2007 on MOSS 2007, with InfoPath 2007 forms on Forms Server. I have gone through my own share of learning and have built some nice integrations. I have also come across several issues; some have answers and some don't. Over the next several days I will be blogging my thoughts on the design, implementation, and runtime topics below. I will enable links as I get a chance to post on each topic.

Topics

  • Design
  • Nintex
    • Best Practices for Nintex
    • Limitation of Nintex
    • What do I like about Nintex?
    • Integrating Nintex with InfoPath
    • Things to watch out for during Nintex workflow development.
    • Custom My Workflows web part for Nintex
    • Custom My Workflow Tasks web part for Nintex
    • Links in Nintex Workflow
    • Generating Custom Workflow ID
    • Updating InfoPath data from Nintex
    • Extracting File attachments from InfoPath submissions
    • Implementing Item security for other assets lists/libraries
    • Email links to InfoPath try to open the form in the rich InfoPath client as opposed to the browser-based form.
  • InfoPath
    • How to exceed InfoPath 5 expression barrier from Rules?
    • How to show InfoPath data for debugging with a trick?
    • How to improve Custom People picker speed?
    • Custom InfoPath footer with version/build information.
    • The quirks in InfoPath over Forms Server.
    • What not to do?
  • SharePoint
    • Workflow timer frequency
    • Save conflict issue
    • How to secure Forms library
    • How to secure an Assets List/Library
  • Implementation
    • Deploying InfoPath Forms
    • Deploying Nintex Workflows
  • Development
    • My Build Process
    • My Build Scripts
    • How to debug?

If you are interested in any of these topics please leave comment and I will blog on them.

Saturday, December 5, 2009

Installing SPD 2007 along with Office 2010 Beta

Issue

On my laptop I have Windows 7 x64 installed and I run MOSS 2007 + SQL 2008 (refer to my other blog on how I got MOSS 2007 installed and running on my local Windows 7). Next I went ahead to install the Office 2010 x64 Beta as the bits became available. I ran into trouble with the Office 2010 install complaining that previous versions of Office programs were installed and must first be uninstalled, pointing to Office SharePoint Designer 2007. This is a documented and discussed issue with Office 2010. But I still need SPD 2007 in order to work with MOSS 2007.

Solution

So I took a chance by first uninstalling SPD 2007 and then continuing with the Office 2010 Beta x64 install, which worked well.

I then installed SPD 2007 once again (there is only an x86 version of SPD 2007); it installed fine and SPD 2007 works as expected.

Follow up

Following this success, next I am venturing into installing SharePoint 2010 alongside MOSS 2007 on my Windows 7 machine (I know Microsoft says it is not supported; well, the MOSS 2007 install said the same until someone figured out why not). Look for my next blog post.

Installing MOSS 2007 on Windows 7

    Overview

    After Windows 7 was RTMed, I wanted to once again refresh my laptop with the released version as opposed to running the old RC version, which would have timed out in July 2010. With the RC version I had SQL Server 2008 and MOSS 2007 installed on my Windows 7 RC install, and it worked very well.

    But this time I once again had to work through some challenges to get the MOSS install to kick in. I followed all the steps given in the references below, but wanted to add a little more clarity to some of them and also share the steps that finally made it work for me.

    Also, thanks to the Bamboo Solutions folks (Jonas Nilsson) for writing the program that fakes out the MOSS installer into treating Windows 7 as a server OS and letting the install continue, and to Randy Williams from Synergy for sharing the small trick of renaming SharePoint.exe back to setup.exe.

    Here goes my little contribution…

    References

    http://community.bamboosolutions.com/blogs/bambooteamblog/archive/2009/05/07/installing-wss-3-0-moss-sp2-on-windows-7-rc.aspx

    http://community.bamboosolutions.com/blogs/bambooteamblog/archive/2008/05/21/how-to-install-windows-sharepoint-services-3-0-sp1-on-vista-x64-x86.aspx

    http://www.synergyonline.com/blog/blog-moss/Lists/Posts/Post.aspx?ID=89

    Steps

    Follow all the steps, in sequence, as described in the first two links in my references.

    Tip 1:

    In order to turn off Application Compatibility, search for “policy” in the Windows 7 Start menu and review the search results (screenshots omitted).

    Choose “Edit group policy”; alternatively, you can simply run “gpedit.msc”.

    Tip 2:

    If you have renamed the original “<your MOSS 2007 install folder>\x64\setup.exe” to sharepoint.exe, you may keep getting the help text for the extract command (screenshot omitted); when I renamed sharepoint.exe back to “setup.exe”, the install kicked in.

    Tip 3:

    Once all the above issues had been resolved, I was able to get MOSS 2007 installed. Next I updated it with the latest SP2 and the October 2009 CU, then ran the configuration wizard, which completed and created Central Administration.

    Now I fired up Central Administration in a browser and found a slightly strange display under the Topology and Services options: “Services on Farm” was not displayed (screenshot omitted).

    As a second test I tried to create a new SSP and got an error (screenshots omitted).

    Checking my local Users and Groups > Administrators, my Windows 7 login account was already part of the Administrators group (which I had set up before I started the SQL Server and MOSS install steps).

    Then I figured this must be something in the Windows 7 UAC that was not set correctly, but there was no option to check any further details.

    So I ventured into Local Security Policy to see whether any policies were limiting my account (even though it is a member of the local Administrators group) and blocking the full access/permissions needed above.

    Then I searched for (not literally) “Can I turn off UAC?” and ended up with a nice guide on TechNet (http://technet.microsoft.com/en-us/library/cc709691(WS.10).aspx); look for the section “To disable Admin Approval Mode”.

    To change this setting in Local Security Policy, follow the steps below:

    1. Click Start, click All Programs, click Accessories, click Run, type secpol.msc in the Open box, and then click OK.

    2. If the User Account Control dialog box appears, confirm that the action it displays is what you want, and then click Continue.

    3. From the Local Security Settings console tree, double-click Local Policies, and then double-click Security Options.

    4. Scroll down and double-click User Account Control: Run all administrators in Admin Approval Mode.

    5. Select the Disabled option, and then click OK.

    6. Close the Local Security Settings window.

    Now, after refreshing Central Administration, all the missing options appeared (screenshot omitted).

    And I was now able to create the SSP and continue my journey of running MOSS 2007 locally on Windows 7, happily.

    Next I want to install and test the Office 2010 Beta while still running SPD 2007; see my next post.

Tuesday, November 24, 2009

Custom 401 for MOSS 2007

Overview

  • Handling a 401 error under WSS 3.0/MOSS 2007 requires special handling.
  • Unlike other error messages that occur after the user is authenticated, where the browser requests are handled through the complete ASP.NET pipeline and then the WSS SPRequestHandler/SPPageFactory, the 401 needs its own treatment.
  • In the case of FileNotFound there is actually an object model property on the SPWebApplication object, which is unique and easier to handle.

How does MOSS present the 401 error message?

  • In the context of the Hawkeye portal, where users are restricted by means of an Active Directory group, users who are part of that group get authenticated and are allowed to browse the portal.
  • Users who are part of the WSGC domain but are not part of the restricted AD group will not be granted access.
  • Upon a login attempt with a proper username/password, the user is redirected to the http://<PortalURL>/_layouts/accessdenied.aspx page.
  • That page is the standard MOSS access denied error page.

How does authentication work between IIS and the browser?

  • When the user points the browser at the Hawkeye portal, the browser first sends the request without user credentials.
  • IIS checks whether the requested site has security enabled.
  • If the site has security enabled (in Hawkeye's case it is Windows authentication), IIS challenges the browser to provide credentials by sending a 401 status code back to the browser.
  • The browser sees the 401 and understands that it is now required to provide credentials.
  • The browser then checks whether it has user credentials.
    • If the IE setting is to log in using the current user's login, the browser sends the current username/password.
    • If the IE setting is to prompt for login, the user is prompted for credentials.
    • If the browser has no credentials, it displays its local 401 error message (this is a client-side IE page).
  • When the browser provides credentials, IIS checks the authentication for the given user/password.
    • If the user is authenticated and has access, the requested page is served.
    • If the user is a valid Active Directory account but does not have access, SharePoint redirects the user to the http://<PortalURL>/_layouts/accessdenied.aspx page, which is a SharePoint error page.
    • If the user is not a valid Active Directory account, the client-side IE 401 error message is displayed.

What 401 error condition can you handle in SharePoint?

  • You can only handle the 401 error condition for valid AD user accounts that do not have access to the portal; this is the only condition the server handles.
  • You cannot handle the 401 error condition for invalid AD user accounts.

What are the challenges in SharePoint with handling 401 error?

  • When the 401 error condition occurs for a valid AD user account with no access to the site, the SharePoint page handler takes over the call and redirects to http://<PortalURL>/_layouts/accessdenied.aspx.
  • This SharePoint behavior ignores any web.config <customErrors> settings.
  • The only way to intercept this redirection is by implementing a custom HTTP module.
  • In the HTTP module, subscribe to the EndRequest event, trap the redirect where the URL is /_layouts/accessdenied.aspx, and then redirect to your custom error page.
  • You can implement your own custom error page under the SharePoint context at a URL location such as the one below.
  • /_layouts/<YourCompanyName>/MyCustomAccessDenied.aspx

Implementation

Below is the sample code for the HTTP module. Substitute your <YourCompanyName>. Compile this into a signed assembly.

using System;
using System.Collections.Specialized;
using System.Configuration;
using System.Web;

namespace Rajesh.MOSS401Redirector
{
    public class RedirectorHttpModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.EndRequest += new EventHandler(context_EndRequest);
        }

        // Required by IHttpModule; nothing to clean up here.
        public void Dispose()
        {
        }

        protected void context_EndRequest(object sender, EventArgs e)
        {
            if (sender is HttpApplication)
            {
                HttpApplication application = (HttpApplication)sender;

                if (application.Request.HttpMethod == "GET")
                {
                    HttpContext context = application.Context;

                    // Trap the SharePoint access denied redirect
                    if (context.Request.Url.ToString().ToLower().Contains("/_layouts/accessdenied.aspx"))
                    {
                        HttpContext.Current.Server.ClearError();
                        HttpContext.Current.Response.Clear();
                        HttpContext.Current.Response.Redirect("/_layouts/<YourCompanyName>/AccessDenied.aspx", false);
                    }
                }
            }
        }
    }
}

Deployment and configuration




  1. Deploy the redirector assembly Rajesh.MOSS401Redirector.dll to the GAC on all front-end web servers.


  2. Deploy your custom error page under C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\TEMPLATE\LAYOUTS\<YourCompanyName>\


  3. Add the following httpModule section to the web application web.config:



<configuration>
  <system.web>
    <httpModules>
      <add name="RedirectorHttpModule" type="Rajesh.MOSS401Redirector.RedirectorHttpModule, Rajesh.MOSS401Redirector, Version=1.0.0.0, Culture=neutral, PublicKeyToken=4f1f85ae373342d6" />
    </httpModules>
  </system.web>
</configuration>



     4. Test by logging in with a user who has portal access and with a user who does not.



     5. This test solution does not include a proper implementation of the custom error page; it is meant as a sample only. You will need to implement a properly supported SharePoint page.