
Friday, May 6, 2011

How to open each file in new window in Excel 2010

 

Overview

You are using Excel 2010 (this may apply to Excel 2007 as well) and you want each file to open in a separate Excel window. By default, Excel 2010 (and Excel 2007) opens every file inside a single Excel program window, because they are designed as MDI applications.

Disclaimer

This is not a Microsoft-supported option. Although it has worked well for me, use caution: the approach relies on undocumented registry changes suggested by many internet search results. It was confusing to try out all the options and get the result right, so I wanted to clearly share what has worked in my situation.

Other Considerations

Opening each file in a new window means running a new instance of the Excel.exe program, which means you use more memory. So use caution when opening many files once Excel is configured this way.

My Platform

  • Windows 7 x64 Ultimate (I expect other editions behave the same)

  • Office Professional Plus 2010 (I expect other editions behave the same)

    My Settings

    To make .xlsx files open in a new window:

  • Open the Registry Editor (regedit.exe).

  • Navigate to “HKEY_CLASSES_ROOT\Excel.Sheet.12\shell\Open”.

  • Right-click the “Open” node and export it to a file to save as a backup.

  • Go to the “Command” node.

  • Update the “(Default)” value to the value below:

    "C:\Program Files\Microsoft Office\Office14\EXCEL.EXE" /e "%1"


  • Rename the “Command” entry below the “Default” entry to “CommandOld”.

  • Rename the “ddeexec” node to “ddeexecOld”.

    After the changes are made, your registry should look similar to the pictures below:

    image

    image

    Now give it a shot by opening your “.xlsx” files; each file will open in a new window.

    Be aware that each Excel.xxxx registry key corresponds to one file type that Excel has an association with.

    For other extensions, find the corresponding nodes below; based on these you can configure the other file types you want opened the same way.

    File Type    Registry sub key    File Version
    .xlsx        Excel.Sheet.12      Excel 2007/2010 file
    .xls         Excel.Sheet.8       Excel 97-2003 file
    .csv         Excel.CSV           Comma separated file
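
    If you prefer to script the change, below is a minimal C# sketch of mine (not part of the original steps) that applies the same edits for a given ProgID from the table above using the .NET registry API. It assumes the default Office14 path, must run elevated, and deletes the “ddeexec” key rather than renaming it (the registry API has no rename), so keep your regedit export as the backup.

    using System;
    using Microsoft.Win32;

    class OpenInNewWindowTweak
    {
        //Applies the registry changes described above for one ProgID, e.g. "Excel.Sheet.12".
        static void ApplyTweak(string progId, string excelPath)
        {
            using (RegistryKey openKey = Registry.ClassesRoot.OpenSubKey(progId + @"\shell\Open", true))
            {
                if (openKey == null) return; //ProgID not registered on this machine

                using (RegistryKey commandKey = openKey.OpenSubKey("command", true))
                {
                    //Point the default value at EXCEL.EXE /e "%1" so a new instance opens the file.
                    commandKey.SetValue("", "\"" + excelPath + "\" /e \"%1\"");

                    //Keep the old "command" entry under a new name, then remove the original.
                    object oldCommand = commandKey.GetValue("command");
                    if (oldCommand != null)
                    {
                        commandKey.SetValue("commandOld", oldCommand);
                        commandKey.DeleteValue("command");
                    }
                }

                //The manual steps rename ddeexec; this sketch deletes it instead,
                //relying on the regedit export as the backup.
                try { openKey.DeleteSubKeyTree("ddeexec"); }
                catch (ArgumentException) { } //already absent
            }
        }

        static void Main()
        {
            ApplyTweak("Excel.Sheet.12", @"C:\Program Files\Microsoft Office\Office14\EXCEL.EXE");
        }
    }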

    Happy Excelling with more windows!

    Friday, April 8, 2011

    How to programmatically copy Web Level, List level and List Item Level Security Information for SharePoint Sites

     

    [Note : This post is part of  my main blog “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”]

    Some Background

    [See my other blog on Drawbacks of current List Export and Import MOSS 2007 Content Migration APIs]

    How to fix this gap?

    Well, there are SharePoint APIs you can use to copy all of the missing security information, but there is an order that makes the process most efficient. Follow the steps below in order:

    • First, create a new site collection, or use your already existing target site collection for the import.
    • In either case, before you perform your import, copy over the custom permission levels first.
    • Next, perform your usual import with all security included (settings.IncludeSecurity = SPIncludeSecurity.All).
    • The import will bring across all the users, SharePoint groups, and any applicable OOTB permission levels.
    • Now copy all the web level permission assignments. Having already copied the custom permission levels, this step ensures all permission assignments are copied over correctly.
    • Next, for each of the lists and libraries you have imported, copy the permission assignments.
    • Finally, for each item in the lists and libraries you have imported, copy the item level permission assignments.

     

    Below are the methods for each of these above tasks:

    Your call order:

    1. PrepareNewSiteCollection(); //Either create the new site collection in preparation for the import, or simply establish your site collection SPSite object.

    2. CopyWebRoles(); //Copy the custom permission levels (role definitions) from the source web.

    3. PerformListExportImporting(); //Perform your list export and import.

    4. CopyWebRoleAssignments(); //Copy the web level permission assignments.
    5. CopyListRoleAssignments(); //Finally, copy your list and list item level permission assignments.
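
    To make that call order concrete, here is a minimal driver sketch of my own. The URLs, the list name, and the PerformListExportImport wrapper are placeholders; the Copy* methods are the ones defined under Common Methods below.

    using Microsoft.SharePoint;

    public static void CopySecurityAroundImport()
    {
        using (SPSite sourceSite = new SPSite("http://source/sites/teamA"))
        using (SPSite destinationSite = new SPSite("http://target/sites/teamA"))
        using (SPWeb sourceWeb = sourceSite.OpenWeb())
        using (SPWeb destinationWeb = destinationSite.OpenWeb())
        {
            //1. Copy custom permission levels (role definitions) before the import.
            CopyWebRoles(sourceWeb, destinationWeb);

            //2. Run your list export/import with settings.IncludeSecurity = SPIncludeSecurity.All.
            PerformListExportImport(sourceWeb, destinationWeb); //placeholder for your own export/import wrapper

            //3. Copy web level permission assignments.
            CopyWebRoleAssignments(sourceWeb, destinationWeb);

            //4. Copy list and list item level permission assignments for each imported list.
            SPList sourceList = sourceWeb.Lists["Shared Documents"];
            SPList destinationList = destinationWeb.Lists["Shared Documents"];
            CopyListRoleAssignments(sourceList, destinationList);
            CopyListItemsRoleAssignments(sourceList, destinationList);
        }
    }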

     

    Common Methods:

    public static void CopyWebRoles(SPWeb sourceWeb, SPWeb destinationWeb)
    {
        //First copy Source Web Role Definitions to the Destination Web
        foreach (SPRoleDefinition roleDef in sourceWeb.RoleDefinitions)
        {
            //Skip WSS base permission levels
            if (roleDef.Type != SPRoleType.Administrator
                && roleDef.Type != SPRoleType.Contributor
                && roleDef.Type != SPRoleType.Guest
                && roleDef.Type != SPRoleType.Reader
                && roleDef.Type != SPRoleType.WebDesigner)
            {
                //Handle addition of an existing permission level error
                try { destinationWeb.RoleDefinitions.Add(roleDef); }
                catch (SPException) { }
            }
        }
    }


    public static void CopyWebRoleAssignments(SPWeb sourceWeb, SPWeb destinationWeb)
    {
        //Copy Role Assignments from source to destination web.
        foreach (SPRoleAssignment sourceRoleAsg in sourceWeb.RoleAssignments)
        {
            SPRoleAssignment destinationRoleAsg = null;

            //Get the source member object
            SPPrincipal member = sourceRoleAsg.Member;

            //Check if the member is a user
            try
            {
                SPUser sourceUser = (SPUser)member;
                SPUser destinationUser = destinationWeb.AllUsers[sourceUser.LoginName];
                if (destinationUser != null)
                {
                    destinationRoleAsg = new SPRoleAssignment(destinationUser);
                }
            }
            catch
            { }

            if (destinationRoleAsg == null)
            {
                //Check if the member is a group
                try
                {
                    SPGroup sourceGroup = (SPGroup)member;
                    SPGroup destinationGroup = destinationWeb.SiteGroups[sourceGroup.Name];
                    destinationRoleAsg = new SPRoleAssignment(destinationGroup);
                }
                catch
                { }
            }

            //At this stage we should have the role assignment established either by user or group
            if (destinationRoleAsg != null)
            {
                foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
                {
                    try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                    catch { }
                }

                if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
                {
                    //Handle addition of an existing permission assignment error
                    try { destinationWeb.RoleAssignments.Add(destinationRoleAsg); }
                    catch (ArgumentException) { }
                }
            }
        }

        //Finally update the destination web
        destinationWeb.Update();
    }


    public static void CopyListRoleAssignments(SPList sourceList, SPList destinationList)
    {
        //First check if the Source List has unique permissions
        if (sourceList.HasUniqueRoleAssignments)
        {
            //Break List permission inheritance first
            destinationList.BreakRoleInheritance(true);

            //Remove current role assignments
            while (destinationList.RoleAssignments.Count > 0)
            {
                destinationList.RoleAssignments.Remove(0);
            }

            //Copy Role Assignments from source to destination list.
            foreach (SPRoleAssignment sourceRoleAsg in sourceList.RoleAssignments)
            {
                SPRoleAssignment destinationRoleAsg = null;

                //Get the source member object
                SPPrincipal member = sourceRoleAsg.Member;

                //Check if the member is a user
                try
                {
                    SPUser sourceUser = (SPUser)member;
                    SPUser destinationUser = destinationList.ParentWeb.Users.GetByEmail(sourceUser.Email);
                    destinationRoleAsg = new SPRoleAssignment(destinationUser);
                }
                catch
                { }

                if (destinationRoleAsg == null)
                {
                    //Check if the member is a group
                    try
                    {
                        SPGroup sourceGroup = (SPGroup)member;
                        SPGroup destinationGroup = destinationList.ParentWeb.SiteGroups[sourceGroup.Name];
                        destinationRoleAsg = new SPRoleAssignment(destinationGroup);
                    }
                    catch
                    { }
                }

                //At this stage we should have the role assignment established either by user or group
                if (destinationRoleAsg != null)
                {
                    foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
                    {
                        try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationList.ParentWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                        catch { }
                    }

                    if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
                    {
                        //Handle addition of an existing permission assignment error
                        try { destinationList.RoleAssignments.Add(destinationRoleAsg); }
                        catch (ArgumentException) { }
                    }
                }
            }

            //Does not require a list update
            //destinationList.Update();
        }
        else
        {
            //No need to assign permissions
            return;
        }
    }

    public static void CopyListItemsRoleAssignments(SPList sourceList, SPList destinationList)
    {
        foreach (SPListItem sourceListitem in sourceList.Items)
        {
            CopyListItemRoleAssignments(sourceListitem, destinationList.GetItemById(sourceListitem.ID));
        }
    }
    public static void CopyListItemRoleAssignments(SPListItem sourceListItem, SPListItem destinationListItem)
    {
        //First check if the Source List Item has unique permissions
        if (sourceListItem.HasUniqueRoleAssignments)
        {
            //Break item permission inheritance first
            destinationListItem.BreakRoleInheritance(true);
            destinationListItem.Update();

            //Remove current role assignments
            while (destinationListItem.RoleAssignments.Count > 0)
            {
                destinationListItem.RoleAssignments.Remove(0);
            }
            destinationListItem.Update();

            //Copy Role Assignments from source to destination list item.
            foreach (SPRoleAssignment sourceRoleAsg in sourceListItem.RoleAssignments)
            {
                SPRoleAssignment destinationRoleAsg = null;

                //Get the source member object
                SPPrincipal member = sourceRoleAsg.Member;

                //Check if the member is a user
                try
                {
                    SPUser sourceUser = (SPUser)member;
                    SPUser destinationUser = destinationListItem.ParentList.ParentWeb.AllUsers[sourceUser.LoginName];
                    if (destinationUser != null)
                    {
                        destinationRoleAsg = new SPRoleAssignment(destinationUser);
                    }
                }
                catch
                { }

                //Not a user, so check if the member is a group
                if (destinationRoleAsg == null)
                {
                    try
                    {
                        SPGroup sourceGroup = (SPGroup)member;
                        SPGroup destinationGroup = destinationListItem.ParentList.ParentWeb.SiteGroups[sourceGroup.Name];
                        if (destinationGroup != null)
                        {
                            destinationRoleAsg = new SPRoleAssignment(destinationGroup);
                        }
                    }
                    catch
                    { }
                }

                //At this stage we should have the role assignment established either by user or group
                if (destinationRoleAsg != null)
                {
                    foreach (SPRoleDefinition sourceRoleDefinition in sourceRoleAsg.RoleDefinitionBindings)
                    {
                        try { destinationRoleAsg.RoleDefinitionBindings.Add(destinationListItem.ParentList.ParentWeb.RoleDefinitions[sourceRoleDefinition.Name]); }
                        catch { }
                    }

                    if (destinationRoleAsg.RoleDefinitionBindings.Count > 0)
                    {
                        //Handle addition of an existing permission assignment error
                        try { destinationListItem.RoleAssignments.Add(destinationRoleAsg); }
                        catch (ArgumentException) { }
                    }
                }
            }

            //Use SystemUpdate so item update metadata is not affected.
            destinationListItem.SystemUpdate(false);
        }
        else
        {
            //No need to assign permissions
            return;
        }
    }



     






    Drawbacks of current List Export and Import MOSS 2007 Content Migration APIs

     

    [Note : This post is part of  my main blog “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”]

    Some Background.

    Suppose you are considering using the SharePoint Content Migration API to copy your SharePoint content from one site to another. Consider the tested scenarios below so you know what you will actually get out of the API for your implementation.

    When you use a SharePoint SPWeb object as the source for the Export/Import API process, you get the SharePoint permissions, SharePoint users and SharePoint groups, as well as the site permissions. In addition, since all the lists and list items fall under the larger scope of the SPWeb, the entire set of list and list item level permissions is copied as well.

    Other than security, all of your metadata and date and time stamps are retained, so you are covered for the most part (excluding any customizations, workflows, alerts, etc., which you are responsible for transitioning appropriately).

    I have experienced a lack of security data being transferred appropriately when exporting/importing anything below the SPWeb object.

    What security information does SharePoint List Export/Import API Process copy?

    • Copies only the Users/AD Groups  and the SharePoint Groups and their members.
    Comparing source and target (screenshots omitted):

    • Users: users are missing on the target.
    • Permission Levels: custom permission level(s) are missing on the target.
      (Publishing Portal specific permission levels will also be missed if the target site collection is not a Publishing Site, which is fine.)

     

    What security information does SharePoint List Export/Import API Process not copy?

    • Does not copy Web Level Permission Assignments.
    • Does not copy List/Lib Level Permission Assignments (Other than the OOTB Groups and OOTB Permission Levels).
    • Does not copy List/Lib Item/Folder Level Permission Assignments (Other than the OOTB Groups and permission levels).
    Comparing source and target (screenshots omitted):

    • Web Level Permissions: all web level permission assignments are missing.
    • Library Level Permissions: all library level permission assignments are missing.
    • Library Item Level Permissions: all library item level permission assignments are missing.

     

    How to fix this gap?

    See my other blog on How to programmatically copy Web Level, List level and List Item Level Security Information for SharePoint Sites

    Thursday, April 7, 2011

    SharePoint (MOSS 2007) Export and Import API specifics (PRIME)

     

    [Note : This post is part of  my main blog “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”]

    [Also see my other blog on Drawbacks of current List Export and Import Content Migration APIs]

    [Also see my other blog on Copying Web Level, List level and List Item Level Security Information]

    Overview

    There are several good references when it comes to the SharePoint Export and Import API (refer to my main blog on SharePoint 2010 Migration). I still wanted to call out, as a reference, all of the options that need to be set appropriately to get your export and import going.

    Export Specifics

    Below I am listing all the possible export settings properties, followed by my comments:

    Basic Object definition properties to be set:

    SPExportObject exportObject = new SPExportObject();
    exportObject.Id = web.ID;
    exportObject.IncludeDescendants = SPIncludeDescendants.All;
    exportObject.Type = SPDeploymentObjectType.Web;

     

    Export Specific settings:

    SPExportSettings settings = new SPExportSettings();

    //Standard Options
    settings.SiteUrl = web.Url;
    settings.FileLocation = exportFolderPath;
    string exportImportSubFolder = YourExportFolder;

     

    Export Log File settings:
    string exportLogFile = YourExportLogFilename;
    if(System.IO.File.Exists(YourExportLogFilename))
    {
        File.Delete(YourExportLogFilename);
    }
    settings.LogFilePath = YourExportLogFilename;


    Export general Settings:
    settings.FileCompression = false;
    settings.ExcludeDependencies = true;
    settings.OverwriteExistingDataFile = true;

    Export events to be disabled Settings:
    settings.HaltOnNonfatalError = false;
    settings.HaltOnWarning = false;


    Export configurable Settings:
    settings.IncludeSecurity = SPIncludeSecurity.All;
    settings.IncludeVersions = SPIncludeVersions.All;
    settings.CommandLineVerbose = true; 
    settings.ExportMethod = SPExportMethodType.ExportAll;


    Note this setting if you want to do a trial (mock) run of the export:
    settings.TestRun = true;  
                        
    As a best practice, run Validate(); your try/catch can then catch errors early, well before the export is kicked off:
    settings.Validate();


    Final Export Run Settings:
    settings.ExportObjects.Add(exportObject);
    SPExport export = new SPExport(settings);
    export.Run();
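
    Pulling the fragments above together, here is a consolidated sketch of the export under a few assumptions of mine: the folder and log file paths are placeholders, the site collection URL is taken from web.Site.Url, and the versions setting uses SPIncludeVersions.All as above.

    using System.IO;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Deployment;

    public static void ExportWeb(SPWeb web, string exportFolderPath, string exportLogFilePath)
    {
        //Define what to export: the web and all of its descendants.
        SPExportObject exportObject = new SPExportObject();
        exportObject.Id = web.ID;
        exportObject.IncludeDescendants = SPIncludeDescendants.All;
        exportObject.Type = SPDeploymentObjectType.Web;

        SPExportSettings settings = new SPExportSettings();
        settings.SiteUrl = web.Site.Url; //site collection URL
        settings.FileLocation = exportFolderPath;

        //Start with a fresh log file on each run.
        if (File.Exists(exportLogFilePath))
        {
            File.Delete(exportLogFilePath);
        }
        settings.LogFilePath = exportLogFilePath;

        settings.FileCompression = false;
        settings.ExcludeDependencies = true;
        settings.OverwriteExistingDataFile = true;
        settings.HaltOnNonfatalError = false;
        settings.HaltOnWarning = false;
        settings.IncludeSecurity = SPIncludeSecurity.All;
        settings.IncludeVersions = SPIncludeVersions.All;
        settings.CommandLineVerbose = true;
        settings.ExportMethod = SPExportMethodType.ExportAll;

        settings.ExportObjects.Add(exportObject);
        settings.Validate(); //catch configuration errors before the export is kicked off

        SPExport export = new SPExport(settings);
        export.Run();
    }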

     

    Import Specifics

    Basic Object definition properties to be set:
    SPImportSettings settings = new SPImportSettings();
    settings.SiteUrl = YourNewSiteCollection.Url;
    //Assign the importing parent web URL
    settings.WebUrl = DestinationWebURL;

    Set the file locations for where the exported data is located:
    settings.FileLocation = YourImportFolderPath;

    Set the import log file name. Delete it first if it exists from the previous run; otherwise it will be appended to and could get large:
    if (System.IO.File.Exists(YourImportLogFile))
    {
        File.Delete(YourImportLogFile);
    }
    settings.LogFilePath = YourImportLogFile;

    No need to set compressed since we will be importing from an uncompressed export:
    settings.FileCompression = false;

    Set to false to be able to reparent into a new site collection:
    settings.RetainObjectIdentity = false; 

    You want all the security to be included (See my other blog on .... ):
    settings.IncludeSecurity = SPIncludeSecurity.All;

    You want to set this to ImportAll to retain all the user and date/time information:
    settings.UserInfoDateTime = SPImportUserInfoDateTimeOption.ImportAll;

    You want to suppress the after events to increase your import performance:
    settings.HaltOnNonfatalError = false;
    settings.HaltOnWarning = false;
    settings.SuppressAfterEvents = true;

    You want to get all the versions:
    settings.UpdateVersions = SPUpdateVersions.Append;

    Verbose log will generate detailed log while running:
    settings.CommandLineVerbose = true;

    As a best practice, run Validate(); your try/catch can then catch errors early, well before the import is kicked off:
    settings.Validate();

    Final Import Run Settings:
    SPImport import = new SPImport(settings);
    import.Run();
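
    And a matching consolidated import sketch, again under assumptions of my own: the destination site, web URL, import folder, and log file path are placeholders, and SuppressAfterEvents is set to true per the note above.

    using System.IO;
    using Microsoft.SharePoint;
    using Microsoft.SharePoint.Deployment;

    public static void ImportWeb(SPSite destinationSite, string destinationWebUrl, string importFolderPath, string importLogFilePath)
    {
        SPImportSettings settings = new SPImportSettings();
        settings.SiteUrl = destinationSite.Url;
        settings.WebUrl = destinationWebUrl; //the importing parent web URL
        settings.FileLocation = importFolderPath;

        //Start with a fresh log file on each run.
        if (File.Exists(importLogFilePath))
        {
            File.Delete(importLogFilePath);
        }
        settings.LogFilePath = importLogFilePath;

        settings.FileCompression = false;          //importing from an uncompressed export
        settings.RetainObjectIdentity = false;     //allow reparenting into a new site collection
        settings.IncludeSecurity = SPIncludeSecurity.All;
        settings.UserInfoDateTime = SPImportUserInfoDateTimeOption.ImportAll;
        settings.HaltOnNonfatalError = false;
        settings.HaltOnWarning = false;
        settings.SuppressAfterEvents = true;
        settings.UpdateVersions = SPUpdateVersions.Append;
        settings.CommandLineVerbose = true;

        settings.Validate(); //catch configuration errors before the import is kicked off

        SPImport import = new SPImport(settings);
        import.Run();
    }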

    How to programmatically lock SharePoint Site Collection while Exporting and other maintenance work

     

    [Note : This post is part of  my main blog “Moving Large MOSS 2007 Sites” and “SharePoint 2010 Migration”]

    In order to ensure your exported content remains consistent from the start of the export through the completion of the import, consider read-locking the site collection you are exporting.

    Let's see how the UI lock settings map to the code options:

    SPSite.WriteLocked

    image_thumb1

    SPSite.ReadOnly

    image_thumb2

    SPSite.ReadLocked

    image_thumb3

    Below I am sharing my code for implementing the read-only lock. A few things to note:

    • When you are applying a lock, first set the lock reason (LockIssue) and then set the lock property, at which point the site collection enters the respective lock mode.
    • When you are removing a lock, first set the lock property back to false, which releases the site collection from the respective lock mode, and then clear the lock reason.

    Your Call order

    1. LockSiteCollection(SiteCollection);
    2. Perform Your Export/Import here
    3. UnLockSiteCollection(SiteCollection);
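
    A hedged usage sketch of that call order: wrapping the export/import in try/finally ensures the site collection is unlocked even if the operation throws (SiteCollection is assumed to be an already opened SPSite).

    LockSiteCollection(SiteCollection);
    try
    {
        //Perform your export/import here.
    }
    finally
    {
        UnLockSiteCollection(SiteCollection);
    }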

    Common Methods you can use:

     

    private void LockSiteCollection(SPSite Site)
    {
        //Try to lock the Portal Site Collection
        try
        {
            Site.LockIssue = "Your Lock reason";
        }
        //Handle prior leftover read-only locks due to errors.
        catch (UnauthorizedAccessException)
        {
            Site.ReadOnly = false;
            Site.LockIssue = "Your Lock reason";
        }
        Site.ReadOnly = true;
    }


    private void UnLockSiteCollection(SPSite Site)
    {
        Site.ReadOnly = false;
        Site.ReadLocked = false;
        Site.WriteLocked = false;
        Site.LockIssue = "";
    }





    Site Collection Creation and re-parenting

    [Note : This post is part of  my main blog “Segregating Large MOSS 2007 Sites” and “SharePoint 2010 Migration”]

    When you are importing an SPWeb or an SPWeb tree into a new site collection, you will want to first create the new site collection. Moving a given SPWeb to become the root web of a new site collection is called re-parenting. The references above cover this in more detail, but I wanted to emphasize and clarify why we should create a new site collection via STSADM createsite as opposed to SPWebApplication.CreateSite().

    SPWebApplication.CreateSite().

    Result: 

    • Site Collection
    • -->Rootweb is created with given Template Type (STS/MPS/Custom)

    What happens here: you can only import on top of the RootWeb. So the end result is that your imported web is always a sub web of the rootweb.

     

    STSADM  createsite

    Result: 

    • Site Collection
    • -->Rootweb is created with Blank Template Type (NOT STS/MPS/Custom)

    What happens here: you can import in to the RootWeb. So the end result is that your imported web can be at rootweb level.
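
    For reference, a typical createsite command looks like the line below (the URL and owner account are placeholders of mine); omitting the -sitetemplate option is what leaves the root web without a template, so your import can land at the root web level.

    stsadm -o createsite -url http://server/sites/newsitecollection -ownerlogin DOMAIN\username -owneremail username@domain.com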

    Tuesday, April 5, 2011

    How to find SharePoint List and Libraries true Size?

     

    Overview

    I wanted to get the size of the lists and libraries in my portal in a reporting fashion. For a Publishing Portal site collection you can get list and library sizes from Site Actions, under the Site Collection Administration group, via “Storage space allocation”. This is great info, but it is not something I can consume programmatically.

    image 

    If you notice, this page is served through /_layouts/storman.aspx. Further digging into this page points to “Microsoft.SharePoint.ApplicationPages.StorMan”, which is obfuscated, and hence I am not sure how the above data is retrieved.

     

    Further exploration

    Still hoping to find the little nugget that produces this information, I started digging into the content database stored procedures.

    Then I came across two interesting stored procedures: proc_GetListSizes and proc_GetDocLibrarySizes

    Both stored procedures take the site collection GUID. When test-run, the results were the same as the Storage space allocation page above.

     

    So what I did with these stored procedures?

    I took these two stored procedures and ran them against the content database of my portal site collection, merged the results of both, and used some rudimentary caching to store and retain the result set. Now you can list or get the size of a given SPList object in MB.

    Warning

    This is not recommended against a production system, as querying the content database directly is not a Microsoft-supported operation. Also be aware that the result set could be huge and may impact your system.

    Well, here is what you are waiting for… the code:

    I have tested this against MOSS 2007 and have not had a chance to validate it against any other SharePoint versions yet.

    • I have included GetWebSizeWithUnitMeasure() for the total size of a web's lists.
    • I have included GetListSizeInBytes() for the size of a single list.
    • I have also included DefineSizeWithUnitMeasure(), a function to determine the best unit to represent the size in: Bytes, KB, MB, GB, or TB.

     

    Your calls examples

    To get the size of a web:

    string webSizeWithUnitMeasure;
    double webSizeInBytes = GetWebSizeWithUnitMeasure(web, out webSizeWithUnitMeasure);
    //Use webSizeWithUnitMeasure to print the size with its unit.


    To get the size of a list:


    string listSizeWithUnitMeasure;
    double listSizeInBytes = GetListSizeWithUnit(list, out listSizeWithUnitMeasure);
    //Use listSizeWithUnitMeasure to print the size with its unit.



    Common Function




     





    public static double DefineSizeWithUnitMeasure(double sizeInBytes, out string unitMeasure)
    {
        unitMeasure = "Bytes";
        double size = sizeInBytes;

        if (size > 1024)
        {
            size = sizeInBytes / 1024d; //KB
            unitMeasure = "KB";
        }
        if (size > 1024)
        {
            size = size / 1024d; //MB
            unitMeasure = "MB";
        }
        if (size > 1024)
        {
            size = size / 1024d; //GB
            unitMeasure = "GB";
        }
        if (size > 1024)
        {
            size = size / 1024d; //TB
            unitMeasure = "TB";
        }

        return size;
    }

    public static double GetWebSizeWithUnitMeasure(SPWeb web, out string withUnitMeasure)
    {
        double storageUsage = 0d;

        //Sum the size of every list and library in the web.
        foreach (SPList list in web.Lists)
        {
            storageUsage += (double)GetListSizeInBytes(list);
        }

        string unitMeasure = "";
        double webSize = DefineSizeWithUnitMeasure(storageUsage, out unitMeasure);

        withUnitMeasure = string.Format("{0} {1}", webSize.ToString("f"), unitMeasure);

        return storageUsage;
    }





    public static double GetListSizeWithUnit(SPList list, out string withUnitMeasure)
    {
        double listSizeInBytes = (double)GetListSizeInBytes(list);
        string unitMeasure = "";
        double listSize = DefineSizeWithUnitMeasure(listSizeInBytes, out unitMeasure);

        withUnitMeasure = string.Format("{0} {1}", listSize.ToString("f"), unitMeasure);

        return listSizeInBytes;
    }

    public static long GetListSizeInBytes(SPList list)
    {
        long listSize = 0;

        //Look up this list's row in the cached size results by its list ID.
        string filter = string.Format("tp_id='{0}'", list.ID);

        DataTable myDataTable = GetCachedSiteCollectionListSizes(list.ParentWeb.Site);
        DataRow[] dataRows = myDataTable.Select(filter);

        if (dataRows.Length > 0)
        {
            listSize = (long)dataRows[0]["TotalSize"];
        }

        return listSize;
    }

    private static DataTable m_SiteCollectionListSizes;
    private static Guid m_SiteCollectionListSizesSiteID;

    private static DataTable GetCachedSiteCollectionListSizes(SPSite site)
    {
        //Rudimentary caching: reuse the result set until a different site collection is queried.
        if (m_SiteCollectionListSizes == null || m_SiteCollectionListSizesSiteID != site.ID)
        {
            m_SiteCollectionListSizes = GetSiteCollectionListSizes(site);
            m_SiteCollectionListSizesSiteID = site.ID;
        }

        return m_SiteCollectionListSizes;
    }

    private static DataTable GetSiteCollectionListSizes(SPSite site)
    {
        DataTable dataTable = GetDocLibSizes(site);
        //Combine both list and doc lib size results
        dataTable.Merge(GetListSizes(site));

        return dataTable;
    }

    private static DataTable GetDocLibSizes(SPSite site)
    {
        string connectionString = site.WebApplication.ContentDatabases[site.ContentDatabase.Id].DatabaseConnectionString;
        string storedProcName = "proc_GetDocLibrarySizes";

        System.Data.SqlClient.SqlConnection connection = null;
        System.Data.SqlClient.SqlDataReader reader = null;
        DataTable dataTable = null;

        try
        {
            connection = new System.Data.SqlClient.SqlConnection(connectionString);
            connection.Open();

            System.Data.SqlClient.SqlCommand command = new System.Data.SqlClient.SqlCommand(storedProcName, connection);
            command.CommandType = CommandType.StoredProcedure;

            command.Parameters.Add(new System.Data.SqlClient.SqlParameter("@SiteId", site.ID.ToString()));

            reader = command.ExecuteReader();

            dataTable = new DataTable();
            dataTable.Load(reader);
        }
        finally
        {
            if (reader != null)
                reader.Close();
            if (connection != null)
                connection.Close();
        }
        return dataTable;
    }

    private static DataTable GetListSizes(SPSite site)
    {
        string connectionString = site.WebApplication.ContentDatabases[site.ContentDatabase.Id].DatabaseConnectionString;
        string storedProcName = "proc_GetListSizes";

        System.Data.SqlClient.SqlConnection connection = null;
        System.Data.SqlClient.SqlDataReader reader = null;
        DataTable dataTable = null;

        try
        {
            connection = new System.Data.SqlClient.SqlConnection(connectionString);
            connection.Open();

            System.Data.SqlClient.SqlCommand command = new System.Data.SqlClient.SqlCommand(storedProcName, connection);
            command.CommandType = CommandType.StoredProcedure;

            command.Parameters.Add(new System.Data.SqlClient.SqlParameter("@SiteId", site.ID.ToString()));

            reader = command.ExecuteReader();

            dataTable = new DataTable();
            dataTable.Load(reader);
        }
        finally
        {
            if (reader != null)
                reader.Close();
            if (connection != null)
                connection.Close();
        }
        return dataTable;
    }
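
    To use these helpers in the reporting fashion described in the overview, a small driver like the sketch below works inside a console app; the portal URL and the output format are placeholder choices of mine.

    using (SPSite site = new SPSite("http://portal"))
    {
        foreach (SPWeb web in site.AllWebs)
        {
            string webSizeWithUnit;
            GetWebSizeWithUnitMeasure(web, out webSizeWithUnit);
            Console.WriteLine("{0}: {1}", web.Url, webSizeWithUnit);

            foreach (SPList list in web.Lists)
            {
                string listSizeWithUnit;
                GetListSizeWithUnit(list, out listSizeWithUnit);
                Console.WriteLine("  {0}: {1}", list.Title, listSizeWithUnit);
            }

            web.Dispose(); //webs from SPSite.AllWebs must be disposed explicitly
        }
    }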



     






    Friday, March 25, 2011

    Moving Large MOSS 2007 Sites and Content

     

    [Note : This post is part of  my main blog on “SharePoint 2010 Migration”]

    What is Moving Large Sites?

    • If your site collection is over 200 GB, then the underlying dedicated content database is also over 200 GB, exceeding the Microsoft recommended maximum size for a content database.
    • The content and site segregation effort is to move such data out to separate site collections and into a separate content database (or an existing smaller database if you prefer), thereby reducing the overall size of your larger content database.
    • Your site collection could have several lists and libraries with over 1000, or even over 2000, items per list or library, and they may continue to grow.
    • The site collection may also have several document libraries that are several GB in size and may or may not fall under the above category of over 1000 items.
    • You will need to analyze where things are growing. I will cover the steps for this in the Analysis section below.
    • Hence your design for moving such content requires a very flexible approach, picking and choosing each given list, library, or even an entire site (SPWeb) to be moved to a given new site collection or an existing one.
    • I call this moving of selective content to separate sites “Segregation”.

    Design Considerations

    • You want to ensure that the segregated content in the new site collection carries all content, date and time stamps, metadata, pages, applicable settings, and security information.
    • You want your segregation approach to be able to handle each given scenario, such as moving a site or moving a list or library.

    Available tools and options

    MOSS 2007 has the following options when it comes to moving/copying content:

    • Content Migration APIs (PRIME): more granular support (down to item level), full fidelity.
    • STSADM export/import operations: use the above API but are less granular (web level only) and lower fidelity.
    • The Content Deployment UI in Central Administration does not apply here, as it is meant for deploying to a target site.

    Design Approach

    • Based on the need to support individual items, consider implementing a configuration data file that you can feed to your process.
    • The configuration data file should specify a given list, library, or site as the Source Object to be moved.
    • For each such Source Object, a Target should be specified (see the illustrative sketch after this list).

      • The target can be a new or an existing site collection.

      • For an existing site collection, specify the location within the site collection where the content should be moved.

      • For a new site collection, the name, URL, and type of site template will need to be determined.

    • For the entire segregation process, a default set of parameters should be specified, such as the minimum number of versions to be migrated (for example, the last 10 versions). Only on demand should this parameter be overridable for any given Source Object.

    • The segregation export and import should support the entire set of metadata, version history, file status as-is (including checked-out status if possible), and user permission configuration.

    The segregation should also support consistent, as-is migration of data from source to target.
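
    As an illustration only, one shape such a configuration data file could take is sketched below; the element and attribute names are hypothetical and not part of any SharePoint schema.

    <Segregation minimumVersions="10">
      <Move>
        <Source type="List" web="http://source/sites/teamA" list="Shared Documents" />
        <Target siteCollection="http://target/sites/teamA-archive" createNew="true"
                title="Team A Archive" location="/" />
      </Move>
      <Move>
        <Source type="Web" web="http://source/sites/teamB/projects" />
        <Target siteCollection="http://target/sites/teamB" createNew="false" location="/projects" />
      </Move>
    </Segregation>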

    Design Constraints and Decisions

    Implementation Approach:

     

    Site Collection Creation and re-parenting: see my other blog.
    Locking the Exporting Site Collection: see my other blog.
    Export and Import API call specifics: see my other blog.
    Drawbacks of current List Export and Import Content Migration APIs: see my other blog.
    How to programmatically copy Web Level, List level and List Item Level Security Information: see my other blog.

    Thursday, March 24, 2011

    SharePoint 2010 Migration

     

    What am I blogging about?

    Back in 2007 I performed a MOSS 2007 migration with great success, and this year I am back to perform a SharePoint 2010 migration, with several web applications holding TBs of data and several customizations. I want to use this post as the main page for capturing the major steps and tidbits of my findings. I will try to capture the details in order so that you can use them as guidance for your own migration.

    Disclaimer

    Whatever I post here is general information relating to the topic, provided in the spirit of sharing with the community.

    Steps in the order

    Below are the major steps as I approach them; for each I will be including sub-activities and a detailed blog post on the topic.

    1. Why to migrate to SPS 2010?
    2. Set up your development environments.
      • If you have very simple sites with no customizations, or if you already have your development/test environment, then you are covered.
      • Otherwise your first step should be getting development environments set up for both the MOSS 2007 and SPS 2010 platforms.
    3. Simulate your web application(s).
      • If your web applications are running on SSL, here is how to implement SSL in development; see my other blog here.
    4. Perform Analysis
    5. Preparation for Migration
    6. Migration Design (Coming soon…)
    7. Migration Implementation(Coming soon…)
    8. Migration Testing(Coming later…)
    9. Deployment(Coming later…)

    What the Pre-Upgrade Checker report is and what it is not

     

    [This post is part of my main blog post on SharePoint 2010.]

    The Pre Upgrade report captures three different levels of information:

    • Farm level information
      • OS Level
      • SharePoint Farm topology
    • Application level configurations
    • Content level State information

    The Pre-Upgrade report runs in the context of a what-if: what if we were to perform an in-place upgrade on the same MOSS 2007 farm?

    If your MOSS 2007 farm OS is Windows 2003 or 32-bit, you will see the OSPrerequisite check fail. Since you are not going to use the same farm and will not be doing an in-place upgrade, this attribute's status is not applicable.

    More to come…

    Developing Sandboxed Solution in SPS 2010 Tips:

     

    Here I will be collecting all the relevant information that would help us deal with development and deployment of  Sandboxed Solution for SharePoint 2010.

    Design
    • Ensure the APIs/classes you intend to use are available in the framework supported by sandboxed solutions: check the specific API/class in MSDN. Note that MSDN has a specific flag to denote whether an API is available in sandboxed solutions.

                        image

     

     

    Development
    • For proper development, copy the SharePoint DLL locally only during development; take it back out when you deploy.
    Deployment
    • More to come…

    Tuesday, March 22, 2011

    Implementing SSL Certificate for IIS Sites in Windows 2003 Server for Development Purpose

     

    Scenario

    You have a Windows 2003 server (x86 or x64) with IIS 6.0 as your development server, and you have to test several of your web sites with SSL. All of these web sites happen to be on the same domain name, such as site1.mydomain.com, site2.mydomain.com, site3.mydomain.com, and so on.

    Disclaimer

    I am not a security/certificate expert, but in this post I want to share the steps that have worked very well for me to implement SSL certificates for development purposes using the SelfSSL tool.

    What can you do?

    Instead of assigning individual certificates, you can choose to implement a wildcard certificate, such as one created for *.mydomain.com.

    You can use the SelfSSL utility, which is part of the IIS 6.0 Resource Kit Tools.

    You can use the SSL Diagnostics utility, also part of the IIS 6.0 Resource Kit, to finally validate the certificate assignments for your sites.

    How to implement wildcard Self Certificate in Windows 2003 server with IIS 6.0?
    High level steps:
    1. In IIS, create all your web sites (Site1, Site2, Site3, …).
    2. Use SelfSSL to create a wildcard certificate and assign it to one existing web site in your IIS.
    3. Export the certificate along with its private key to a .pfx file.
    4. From your local machine certificate store, import the certificate into Trusted Root Certification Authorities.
    5. In IIS, assign your newly imported wildcard certificate to each of your sites.
    6. From the command prompt, assign secure bindings for each of your sites.
    7. In your client browser, add your wildcard domain name as a trusted site.

     

    Step 1: Create IIS Sites

    Create your IIS Sites from the IIS first. You don’t have to assign host headers at this time. We will cover the host headers later.

    Step 2: Create Certificate

    First download and install the IIS Resource Kit.  The utilities you choose will be installed at <SystemDrive>:\Program files[(x86)]\IIS Resources\

    As a first step, let's create a wildcard certificate for your domain name, for example “*.mydomain.com”.

    We will use the SelfSSL tool to create a wildcard certificate against any one of the IIS sites (we will apply the certificate to all sites later). Go to the command prompt and navigate to the folder where the SelfSSL utility is stored. You can run SelfSsl /? to see all of the available options. Follow the syntax below:

    <SystemDrive>:\Program files[(x86)]\IIS Resources\>selfssl.exe /N:CN=*.<mydomain.com> /S:<IISSiteID>

    Substitute <mydomain.com> with your common domain name, and for <IISSiteID> substitute the IIS site ID as displayed in the Identifier column of IIS Manager. (The default site is always 1; user-created sites have large numbers.)
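
    For example, with a site whose identifier shows as 1234567890 in IIS Manager, the command would look like this (values are placeholders):

    selfssl.exe /N:CN=*.mydomain.com /S:1234567890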

    image

    The above step creates a wildcard certificate for the given site and stores it under the Local Computer > Personal store.

    Let's check this out:

    From the command prompt run MMC. Add snap-in for Certificates, and choose My Computer.

    image

     

    image

    Now navigate to the Console Root>Certificates (Local Computer)>Personal>Certificates. Here you should find your newly created wild card certificate along with the Machine Certificate and any other personal certificates.

    image

    image

     

    Step 3: Export the certificate

    Now let's export this certificate along with its private key, so that in the next step we can import it into the Trusted Root Certification Authorities store.

    You can export this certificate from two different spots: either from the Certificates management console that we opened in the step above, or from IIS Manager via the IIS site against which we created the wildcard certificate.

    From either spot you can launch the Certificate Export wizard.

    From Certificate management console, you can choose your wild card certificate, right click, All Tasks, Export.

    image

    Or, from IIS, right-click the site you used for the SelfSSL command, open Directory Security, click Server Certificate, and choose “Export the current certificate to a .pfx file”.

    image

    Choose a file name and a local folder where you want the exported certificate to be saved. In the next step, provide a password to protect the certificate, since you are also exporting the private key.

    image

    Step 4: Import certificate to Trusted Root Certificate Authority.

    Now switch back to the Certificates management console. Navigate to the node “Trusted Root Certification Authorities>Certificates”.

    image

    From the Certificates node, right click and choose to import certificate.

    image

    Now choose the certificate file saved above to import, and enter the password you provided above to protect the certificate file.

    Next choose to import the certificate as below.

    image

    Next confirm a successful import.

    image

    Step 5: Assign wildcard certificate from IIS

    Now switch to the IIS MMC. For each of your IIS sites, assign the newly created self-signed wildcard certificate. Leave the SSL port and the default port as they are. The sites may be in a stopped state, and that is fine at this stage, since all of the sites are using the same default port 80.

    Step 6: Assign securebindings.

    A prerequisite for this step is that you have already installed the IIS administrative scripts, located as shown below. (If not, install them from Control Panel > Add/Remove Windows Components > choose IIS and reinstall the AdminScripts component.)

    image

    Now go to the IIS MMC and note the Identifier for each of your sites that needs the new certificate assigned.

    Go to the command prompt and change directory to the AdminScripts folder highlighted above.

    Enter the following command for each of the IIS sites that you want the certificate assigned to:

    cscript.exe adsutil.vbs set /w3svc/<site identifier>/SecureBindings ":443:<host header>"

    where host header is the host header for the Web site, for example, site1.mydomain.com.
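
    For example, if site1.mydomain.com has the identifier 1234567890, the command would be (values are placeholders):

    cscript.exe adsutil.vbs set /w3svc/1234567890/SecureBindings ":443:site1.mydomain.com"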

    After above steps are completed, go back to IIS MMC and start each of your sites.

    Step 7: Add to Trusted Sites

    From your development server browser, add your wildcard domain name as trusted site. (*.mydomain.com)

    image 

    Validation

    From the installation of IIS Resource kit you will also get SSL Diagnostic tool installed.

    image

    Run the SSL Diagnostics tool. The tool checks the SSL certificate for each of the IIS sites and provides a report as shown below.

    image

    In the above screen capture I have masked the domain names I used, but you should expect to see your respective wildcard domain name.

    Next, browse to each of your IIS sites from the browser; you should be able to see each site without any certificate warning.

    Hope these steps simplify your SSL implementation for development purposes.

    Things to consider beyond just SSL
    • Simulate SSL per site in your development environment.
    • Address your browser warning for mixed content, if present.
    • I will update further if anything else comes to my attention.