Sunday, 6 August 2017

SharePoint 2010 on Windows 10: Solving COMException / Unknown error (0x80005000)

This is not a problem I expected to solve, but it happened anyway: a client is using SharePoint 2010 and I needed a local development farm.

I started with the usual configuration requirement for installing on a client OS:
Add <Setting Id="AllowWindowsClientInstall" Value="True"/> to the setup config file (typically files\Setup\config.xml on the install media, or point setup.exe at your own file with the /config switch)

Next, I created a script to make a new configuration database so that I don't have to join a domain:
# Build a PSCredential for the farm account (password inline for brevity only)
$secpasswd = ConvertTo-SecureString "MyVerySecurePassword" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("mydomain\administrator", $secpasswd)

# Use a GUID suffix so each run gets a unique configuration database name
$guid = [guid]::NewGuid()
$database = "spconfig_$guid"

# Create the farm's configuration database without joining a domain
New-SPConfigurationDatabase -DatabaseName $database -DatabaseServer myservername\SharePoint -FarmCredentials $mycreds -Passphrase (ConvertTo-SecureString "MyVerySecurePassword" -AsPlainText -Force)

Running this script is what produced the COMException / Unknown error (0x80005000) from the title.

The solution is simple: enable IIS 6.0 Management Compatibility on the machine (Control Panel > Turn Windows features on or off > Internet Information Services > Web Management Tools > IIS 6 Management Compatibility). SharePoint's provisioning code still talks to IIS through the legacy IIS:// ADSI provider, which only exists when the IIS 6 metabase compatibility components are installed, hence the otherwise cryptic 0x80005000 (an ADSI error code).
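
If you'd rather script the fix, something like this from an elevated PowerShell prompt should do it (feature names can vary between Windows builds, so verify with Get-WindowsOptionalFeature -Online first):

# Enable the IIS 6 metabase / management compatibility components;
# -All also enables any parent features they depend on
Enable-WindowsOptionalFeature -Online -FeatureName IIS-Metabase -All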

The script then ran cleanly, and I now have a very old SharePoint farm to develop against.

Wednesday, 2 August 2017

Azure: Why is my blob 0KB when I move it?

I was archiving documents in Blob Storage and found that the files at the destination always ended up as 0 KB.

I was using the following code:

// Requires the WindowsAzure.Storage client library:
// using Microsoft.WindowsAzure.Storage; using Microsoft.WindowsAzure.Storage.Blob;
var storageAccount = CloudStorageAccount.Parse("CONNECTIONSTRING");
var blobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
var blob = blobContainer.GetBlobReference("myfilename");

// Get the source as a stream
Stream stream = new MemoryStream();
blob.DownloadToStream(stream);

var destBlob = blobContainer.GetBlockBlobReference(blob.Name);
destBlob.UploadFromStream(stream);

blob.Delete();

The code looks like it should work, but DownloadToStream leaves the stream's position at the end of the data it just wrote, so the subsequent upload reads nothing. The stream needs to be rewound to the start before uploading:

blob.DownloadToStream(stream); // (from above)
stream.Position = 0;           // rewind before uploading

And now the copy/move works.
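
For reference, here is a minimal sketch of the corrected move, assuming the same WindowsAzure.Storage client library (the connection string, container name and the archive/ destination prefix are placeholders):

var storageAccount = CloudStorageAccount.Parse("CONNECTIONSTRING");
var container = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
var source = container.GetBlockBlobReference("myfilename");

using (var stream = new MemoryStream())
{
    source.DownloadToStream(stream);

    // DownloadToStream leaves the position at the end of the buffer;
    // rewind so UploadFromStream has something to read
    stream.Position = 0;

    var dest = container.GetBlockBlobReference("archive/" + source.Name);
    dest.UploadFromStream(stream);
}

source.Delete();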

C#: How do I call an API from a console application?

I have used the following code to call an API and pass some parameters in the request headers.
Note the Encoding.GetEncoding(1252) when reading the response: 1252 is the Windows-1252 (Western European) code page, which copes with characters outside plain ASCII.

// Requires: using System.Collections.Generic; using System.IO; using System.Net;
// using System.Net.Cache; using System.Text;
// (Note: the default body is a single space, which passes the IsNullOrEmpty check below and is sent as-is.)
private static string ExecuteAPI(string url, Dictionary<string, string> parameters, string method = "POST", string body = " ", int timeOut = 180000)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = method;

    foreach (var param in parameters)
    {
        request.Headers.Add(param.Key, param.Value);
    }

    if (method == "POST")
    {
        if (!string.IsNullOrEmpty(body))
        {
            var requestBody = Encoding.UTF8.GetBytes(body);
            request.ContentLength = requestBody.Length;
            request.ContentType = "application/json";
            using (var requestStream = request.GetRequestStream())
            {
                requestStream.Write(requestBody, 0, requestBody.Length);
            }
        }
        else
        {
            request.ContentLength = 0;
        }
    }
    else
    {
        request.ContentLength = 0;
    }

    request.Timeout = timeOut;
    request.CachePolicy = new RequestCachePolicy(RequestCacheLevel.BypassCache);

    string output = string.Empty;
    try
    {
        using (var response = request.GetResponse())
        {
            using (var stream = new StreamReader(response.GetResponseStream(), Encoding.GetEncoding(1252)))
            {
                output = stream.ReadToEnd();
            }
        }
    }
    catch (WebException ex)
    {
        if (ex.Status == WebExceptionStatus.ProtocolError)
        {
            using (var stream = new StreamReader(ex.Response.GetResponseStream()))
            {
                output = stream.ReadToEnd();
            }
        }
        else if (ex.Status == WebExceptionStatus.Timeout)
        {
            output = "The request timed out.";
        }
    }

    return output;
}
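
A hypothetical call, just to show the shape of the inputs (the URL, header name and JSON body below are placeholders):

var headers = new Dictionary<string, string>
{
    { "x-api-key", "my-api-key-value" }
};

string result = ExecuteAPI("https://example.com/api/items", headers, "POST", "{\"name\":\"test\"}");
Console.WriteLine(result);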

Load data into a SQL Database using SqlBulkCopy, Entity Framework and reflection

I recently had the requirement to bulk load data into a SQL database. Easy enough, and since I was already using Entity Framework, most of the work had been done for me: the entity classes already describe the table schema.

The plan is as follows:
Use SqlBulkCopy with a DataTable as the input. To do that, I need to build the column headers from the entity type via reflection, add the required data as rows, and then blast the lot into the SQL database.

// Requires: using System; using System.Collections.Concurrent; using System.Data;
// using System.Data.SqlClient; using System.Reflection; using System.Threading.Tasks;
ConcurrentQueue<MyObject> items = GetMyItems(); // method not included

Type t = typeof(MyObject);
var dt = SetHeader(t);
AddItems(dt, items);
Load(dt, "MyTable");

// Add the entity's properties as columns to the DataTable
private DataTable SetHeader(Type t)
{
    var dt = new DataTable();
    PropertyInfo[] pi = t.GetProperties();

    foreach (var p in pi)
    {
        // Guid columns need a typed column; everything else defaults to string
        if (p.PropertyType == typeof(Guid))
        {
            dt.Columns.Add(p.Name, typeof(Guid));
        }
        else
        {
            dt.Columns.Add(p.Name);
        }
    }

    return dt;
}

// Add each item as a row to the DataTable
private bool AddItems(DataTable dt, ConcurrentQueue<MyObject> items)
{
    object table = new object();
    Parallel.ForEach(items, item =>
    {
        object[] values = ObjectToArray(item);

        // DataTable is not thread-safe, so the Add itself is serialised;
        // only the reflection work in ObjectToArray runs in parallel
        lock (table)
        {
            dt.Rows.Add(values);
        }
    });

    return true;
}

// Bulk copy the DataTable into the destination table
private bool Load(DataTable dt, string tableName)
{
    // SqlBulkCopy is IDisposable, so wrap it in a using block
    using (var copy = new SqlBulkCopy(Constants.DATABASE_CNN))
    {
        copy.DestinationTableName = tableName;
        copy.WriteToServer(dt);
    }

    return true;
}

// Convert the object's property values to a string array, in property order
private string[] ObjectToArray(object obj)
{
    if (obj == null)
    {
        throw new ArgumentNullException("obj", "Value cannot be null.");
    }

    Type t = obj.GetType();
    PropertyInfo[] pi = t.GetProperties();

    var returnValue = new string[pi.Length];

    for (int index = 0; index < pi.Length; index++)
    {
        var value = pi[index].GetValue(obj);
        returnValue[index] = value == null ? "" : value.ToString();
    }

    return returnValue;
}
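
One caveat: with no column mappings, SqlBulkCopy maps DataTable columns to the destination table by ordinal position, so the entity's property order has to match the table's column order exactly. Mapping by name is safer; here is a small sketch of the Load method rewritten that way (same Constants.DATABASE_CNN connection string as above):

using (var copy = new SqlBulkCopy(Constants.DATABASE_CNN))
{
    copy.DestinationTableName = tableName;

    // Map each DataTable column to the destination column of the same name,
    // so property order no longer has to match the table's column order
    foreach (DataColumn column in dt.Columns)
    {
        copy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    }

    copy.WriteToServer(dt);
}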