Tuesday, 28 October 2014

Microsoft Dynamics CRM 2011 and SharePoint 2010 "Alternate Access Mapping" error

Last week a local firm contacted us about an issue they had with the Dynamics CRM 2011 and SharePoint 2010 integration, but first I will give you a brief overview of the integration between these two products.
Dynamics CRM 2011 can be integrated with SharePoint in order to provide centralized document management functionality based on SharePoint.
CRM 2011 On-Premise can be combined with MOSS 2007, SharePoint 2010 and SharePoint 2013 (all editions, including Online). The integration gives us a single location to store the documents for the CRM entities and records; users can work with the documents both from the CRM and from SharePoint, and no documents are stored in the CRM itself.
Microsoft Office SharePoint Server 2007 (MOSS 2007) no longer has standard support from Microsoft, so this article is mainly about the integration with SharePoint 2010, which is also where the issue behind this article occurred.
The integration with SharePoint 2010 is a fairly simple process, since the CRM uses SharePoint as document storage. As this is only an overview of the integration, I will not go into detail.
The process is very straightforward and can be done from the CRM interface. You can check the following article for a detailed procedure for integrating these products: Here.
The important part (though not mandatory) here is the List Component that is deployed on the SharePoint side; it gives us an integrated look for the file "grid" and the ability to create libraries and folders automatically from the CRM. Below you can see how it looks with and without the list component installed.

With the list component installed:

CRM11 List Component

Without the list component, the SharePoint page is shown as-is in an iframe:

CRM11 No List Component

The customer's issue started after they migrated the SQL instance of the SharePoint farm to a different server; since then, they received the error below whenever a new record was created in the CRM.

CRM11 Alternate Access mapping error

Error:

An error occurred while loading the page. The URL may not have been mapped in the SharePoint Server. Ask your system administrator to check the Configure alternate access mappings settings in the SharePoint Central Administration.

It is clear that the CRM was trying to create a new folder for the record in the Accounts library (in my case). If the users open a record with an existing folder, they can view the content correctly and upload a new document, but the file view fails to refresh.
My first thought was: "Well, there must be something wrong with the AAM on the SharePoint side."
Unfortunately, after a couple of hours of troubleshooting and testing, it turned out there was nothing wrong with the Alternate Access Mapping.
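If you want to rule AAM out yourself, the mappings are quick to list from the SharePoint 2010 Management Shell; a minimal check, where the web application URL is a placeholder of mine:

# List the Alternate Access Mappings of the affected web application
# (the URL below is a placeholder - use your own)
Get-SPAlternateURL -WebApplication "http://sharepoint" |
    Format-Table IncomingUrl, Zone, PublicUrl -AutoSize
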
According to the customer, nothing had changed except the migration of the SQL instance. I am not sure exactly which versions of SharePoint and CRM were installed, because my access to the systems was limited.
However, I suspected that something was wrong with the list component, so I downloaded the latest available package and deployed it in the document location site.
Can you guess what happened? Yes, everything started to work like a charm!
I cannot confirm the exact versions of the products or the conditions that led to this behaviour, but if you get such errors, I think it is worth trying to upgrade to the latest list component solution available.
At this time the latest package version is 05.00.9690.4159 from 7/1/2014.
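If you prefer PowerShell over the Solution Gallery UI, the list component is a sandboxed solution and can also be deployed from the SharePoint 2010 Management Shell; a sketch, where the path, package name and site URL are my assumptions:

# Upload and activate the list component sandboxed solution
# (path, package name and site URL are placeholders)
Add-SPUserSolution -LiteralPath "C:\Install\crmlistcomponent.wsp" `
    -Site "http://sharepoint/sites/crmdocs"
Install-SPUserSolution -Identity "crmlistcomponent.wsp" `
    -Site "http://sharepoint/sites/crmdocs"
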

You can download the solutions for SharePoint 2010 and 2013 HERE

Monday, 27 October 2014

SPDeletedSite and SharePoint Deleted Site Collections alert script

A couple of weeks ago one of our customers asked me whether there is a way to get an alert when a site collection in their SharePoint 2013 farm is deleted. This was important for them because the power to create and delete site collections is delegated to project owners, team leaders and so on. The customer often receives requests to restore a site that was deleted by mistake, or suddenly notices that a site with important information has disappeared (deleted by its owner). The fastest way to restore such a deleted site is the command Restore-SPDeletedSite, but often the restore request comes after 30 days, when the site has already been permanently deleted.
As far as I know, there is no OOTB feature that can notify you about deleted site collections, so my proposal to the customer was to write a PowerShell script that tracks the deleted site collections and sends notifications. The script is fairly simple: it gets the current deleted sites with the command Get-SPDeletedSite -WebApplication <WebAppUrl>, compares the result with the result from the last run and, if there are new deleted sites, sends a mail with details like the one below (see the sketch after the screenshot):

Alert for Deleted SharePoint site collection
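To illustrate the idea, here is a minimal sketch of that compare-and-notify logic, run from the SharePoint Management Shell; the web application URL, state file path, addresses and SMTP server are placeholders of mine, not the customer's values:

# Get the site collections currently marked for deletion
$deleted = @(Get-SPDeletedSite -WebApplication "http://sharepoint")

# Load the SiteIds recorded on the previous run (empty on the first run)
$stateFile = "C:\Scripts\DeletedSites.txt"
$previousIds = @()
if (Test-Path $stateFile) { $previousIds = @(Get-Content $stateFile) }

# Anything not seen before was deleted since the last run
$newSites = $deleted | Where-Object { $previousIds -notcontains $_.SiteId.ToString() }

if ($newSites) {
    $body = $newSites | Format-List Path, SiteId, DeletionTime | Out-String
    Send-MailMessage -To "spadmin@contoso.com" -From "alerts@contoso.com" `
        -Subject "SharePoint site collection deleted" -Body $body `
        -SmtpServer "smtp.contoso.com"
}

# Persist the current state for the next run
$deleted | ForEach-Object { $_.SiteId.ToString() } | Set-Content $stateFile
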

Since the key point in this script is the retrieval of SPDeletedSite, I would like to explain a bit more about this feature and share its history.
I guess that every SharePoint administrator knows about this feature in SharePoint 2013, and many of us are very happy to have it.
The story starts with the release of SharePoint 2010 and a capability called "Gradual Site Delete". It was designed to mitigate the performance degradation, or even service interruption, caused by so-called lock escalation that can happen in the content database when we attempt to delete a very large site collection. This was an issue in WSS 3.0 and Microsoft Office SharePoint Server 2007 (MOSS 2007).
The short story of "Gradual Site Delete" is that when a user deletes a site collection, it is not deleted instantly; it is marked for deletion and its content becomes unavailable. The actual deletion is handled by a timer job definition called "Gradual Site Delete", which by default runs on a daily schedule (configurable).

Gradual Site Delete Timer Job

This timer job deletes the site collection from the content database in small enough portions to prevent lock escalation. If you want to read more on the subject, please check the post from Bill Baer.

Then, with SharePoint 2010 SP1, we received three additional cmdlets that allow us to work with site collections that are marked for deletion but not yet deleted. They are the same as in SharePoint 2013: Get-SPDeletedSite, Restore-SPDeletedSite and Remove-SPDeletedSite, plus Move-SPDeletedSite (available only in SharePoint 2013).
With the cmdlet Restore-SPDeletedSite we can restore a site collection that was deleted by an end user.
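For example, a one-liner along these lines restores a site by its server-relative path (the path is a placeholder of mine):

# Restore a site collection that is still marked for deletion
Get-SPDeletedSite | Where-Object { $_.Path -eq "/sites/projectx" } | Restore-SPDeletedSite
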
As I said, these 'deleted' sites are eventually deleted permanently by the "Gradual Site Delete" timer job. How many days a site collection is kept as an SPDeletedSite object (marked for deletion) before permanent deletion depends on the recycle bin retention settings configured at the Web Application level.

SharePoint 2013 Recycle Bin Settings

By default the Recycle Bin is enabled and the retention period for deleted items is 30 days. This means that a deleted site collection will remain a "restorable" SPDeletedSite object for 30 days.
Be aware that when you delete a site collection in PowerShell with the command Remove-SPSite, you additionally need to specify the -GradualDelete parameter in order to use Gradual Site Delete.
If this parameter is not used, the site collection is deleted instantly; using Gradual Site Delete is strongly recommended for large site collections.
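For example (the site URL is a placeholder):

# Delete a large site collection using Gradual Site Delete
Remove-SPSite -Identity "http://sharepoint/sites/bigsite" -GradualDelete -Confirm:$false
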

Download the script from: Technet Gallery

Monday, 20 October 2014

Office 365 Demo Tenant available for Microsoft Partners

In this post I am going to tell you a bit more about the Office 365 demo tenants available for Microsoft Partners. This is not something new, but I think it is very useful and not as well known as it should be.
Last week I needed to do a SharePoint Online demo. This is not my primary line of work, so I was not prepared with any Office 365 demo tenant or content, and this offering came in very useful.
If your company is a Microsoft Partner and your Live account is associated with your company, you can take advantage of a free Office 365 demo tenant for 90 days to do your shiny Office demos. It includes the full Office 365 stack (Exchange, Lync, SharePoint Online, Yammer, Power BI), demo content to demonstrate all the good things, and 25 users. If you already have a spare Office 365 tenant, there is an option to receive only the demo content, but I haven't tested it.
The first thing you need to do is go to https://www.microsoftofficedemos.com/ and log in as a Partner.

Microsoft Office Demo Site


After a successful login, go to GET DEMO in order to provision a new demo tenant.
There you will see four options:

1. Create new demo environment - This is the "full" package (demo tenant + content).

2. Refresh a demo environment created with this tool - Do not expect this option to refresh an expired demo tenant after 90 days; if a demo tenant reaches 90 days, you need to create a new one. The explanation under the link for this option is descriptive enough (never tested it).

3. Just get demo content - I have my own O365 tenant! - As I said above, this should provision only the demo content. I have never tested this, because I do not have a non-production O365 tenant where I can try it.

4. Create an empty Office 365 tenant - I guess that this will create a demo tenant without the content.

Microsoft Office Demo Tenant Options


If you choose the first option, a new tenant with content, you will need to complete two more steps. In the first, you specify the type of content (Standard, Productivity Solution, Industry-Specific); in the second, you specify the information for the tenant, such as the tenant name and your sign-up information (your mail, phone, etc.).
The next thing you have to do is wait. You will be redirected to the page below, where you can follow the deployment status of the different components.

Microsoft Office Demo Tenant Provisioning Status

Be aware that if the provisioning process lasts more than 48 hours, there is a chance it will be canceled and you will need to start all over again. You will receive regular status updates via email, and you can also check the status in the Microsoft Office Demo site under CHECK DEMO.
So if you need to prepare an important demo or presentation, do not leave this for the last moment.
As I said above, you will receive 25 demo users to use with the demo scripts provided in the Microsoft Office Demo site Resources. There will also be scripts available somewhere in SharePoint; in my case they were in https://<MyTenant>.sharepoint.com/Shared Documents/Forms/AllItems.aspx.
Once the Demo is provisioned you will receive the Tenant Admin credentials and you are ready to explore and demonstrate Office 365 to your future customers. 
While you are deploying the demo, you will see a warning that you should not share credentials from this demo with your customers. Please do not share them!

SharePoint Online Demo Sites

Friday, 10 October 2014

Create ZIP Archive in PowerShell and SharePoint IIS and ULS archiving scripts

In my career as a Windows web hosting administrator, and now as a dedicated SharePoint administrator, I have always needed a way to clean up the IIS logs, since IIS does not provide such functionality.
I guess this is an issue for everyone managing applications hosted on IIS.
Most of us leave the log rollover settings at Schedule, Daily and end up with a bunch of files in our log directory. The size of the logs depends on how many hits the application receives. If the application is a public website, the logs can become quite big and the volume where they are stored can run out of space (in any case, you do not want the IIS logs stored on the system drive or on a drive with application files).
My most recent experience was with one of our customers who host their public websites on SharePoint 2013. The sites are quite busy, and we get huge 1 GB logs almost every day; even with NTFS compression the files still take around 300 MB on disk.
Since it is good practice to keep these log files as long as we can, we have two options.
Option one is to manually zip (or use our archiving tool of choice) all the log files, put the archives away and delete the originals in order to save some space; option two is to write or find a script that does this for us.
I found many scripts for archiving IIS logs. Most of them use 7-Zip to create the zip archives, or the Write-Zip cmdlet from the PowerShell Community Extensions. Both tools rely on third-party compiled code that I do not want to run on the servers hosting the public sites of our high-profile customer.
At the moment we don't have a version of PowerShell with native cmdlets for working with .zip files. The new PowerShell version 5 will have native cmdlets that can create and extract zip files, but it is still a preview and is not recommended for production environments. If you want to check it out, there is a preview of Windows Management Framework V5 available for download.
So, I needed to write my own script that does the work without any third-party tools.
After some research I found classes added to the System.IO.Compression namespace in .NET 4.5 that could do the work. The .NET Framework 4.5 is a prerequisite for SharePoint 2013, so I did not need to install anything additional.
Below you can see the function that I used in the IIS log archiving script.


Function New-LogArch{
    [CmdletBinding()]
    Param (
        [parameter(Mandatory=$true)]
        [System.IO.FileInfo]$file,
        [parameter(Mandatory=$true)]
        [string]$Location
    )
    PROCESS{
        # Load the .NET 4.5 compression assembly
        Add-Type -Assembly System.IO.Compression.FileSystem
        # Temporary empty folder, used only to create an empty zip archive
        $tmpFolder = $Location + '\' + $file.BaseName
        New-Item -ItemType directory -Path $tmpFolder | Out-Null
        # The archive gets the name of the log file, with a .zip extension
        $destfile = $Location + "\" + $file.BaseName + ".zip"
        $logPath = $file.FullName
        $zippedName = $file.Name
        # Optimal gives the smallest file size possible
        $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
        # Create an empty archive from the empty temporary folder
        [System.IO.Compression.ZipFile]::CreateFromDirectory($tmpFolder, $destfile, $compressionLevel, $false)
        # Open the archive for update and add the log file as an entry
        $archive = [System.IO.Compression.ZipFile]::Open($destfile, [System.IO.Compression.ZipArchiveMode]::Update)
        [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($archive, $logPath, $zippedName, $compressionLevel) | Out-Null
        # Dispose to release the lock held on the file in Update mode
        $archive.Dispose()
        # The temporary folder is no longer needed
        Remove-Item $tmpFolder -Force
    }
}


The function takes two parameters: the FileInfo object for the log file, from which it can get the literal path, and the location where the archive will be saved.
The first thing we do with the System.IO.Compression classes is define the compression level in the variable $compressionLevel; I am setting Optimal in order to get the smallest file size possible.
The next thing is to create an empty zip archive from the temporary, empty directory created earlier in the function; I was unable to find a way to create the archive directly from the log file.
Once the zip file is created with the correct location, name and compression level, I "open" it into the variable $archive.
With the archive opened, we use the CreateEntryFromFile method of the System.IO.Compression.ZipFileExtensions class, which adds a file to a zip archive; we use it to add the original log file.
Finally, we invoke the Dispose() method on $archive, because the file is locked while the archive is open in Update mode.
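To give an idea of how the function is called, here is a usage sketch with paths and retention of my own choosing (the full script adds the retention checks and cleanup on top of this):

# Archive all logs older than 30 days from a W3SVC log folder
$logDir  = "C:\inetpub\logs\LogFiles\W3SVC1"
$archDir = "D:\LogArchive"
Get-ChildItem -Path $logDir -Filter "*.log" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    ForEach-Object { New-LogArch -file $_ -Location $archDir }
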
Now that I had the key component to zip the log files, I created the IIS Log Archive and Cleanup script.
It finds the log locations of all IIS sites, checks when the logs were last modified and, if they are older than the retention period you define as a parameter, archives each log to the destination you specify and deletes the original file. You end up with one zip file per log, named after the original log file. The script also checks for old archive files and deletes any zip files older than the value you specify.
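For reference, this is roughly how the log folders of the IIS sites can be discovered; a sketch using the WebAdministration module that ships with IIS 7.5 and later:

# Resolve the log folder of every IIS site
Import-Module WebAdministration
Get-Website | ForEach-Object {
    # The directory attribute may contain variables like %SystemDrive%
    $root = [Environment]::ExpandEnvironmentVariables($_.logFile.directory)
    # Each site writes its logs into a W3SVC<SiteId> subfolder
    Join-Path -Path $root -ChildPath ("W3SVC" + $_.id)
}
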
I also wrote a script that can archive the SharePoint ULS logs, because it is good practice to keep them in case you need to do performance analysis or troubleshooting, for example.



Download "IIS Log Archive and CleanUp Script" from: Technet Gallery

Download "ULS Log Archive Script" from: Technet Gallery