Posts

Showing posts from August, 2017

SharePoint 2010 on Windows 10: Solving COMException / Unknown error (0x80005000)

This is not a problem I expected to solve, but it happened anyway. A client is using SharePoint 2010 and I need a local development farm. I started with the usual configuration requirements:

- Add `<Setting Id="AllowWindowsClientInstall" Value="True"/>` to the setup config.
- Created a script to make a new configuration database so that I don't have to join a domain:

```powershell
$secpasswd = ConvertTo-SecureString "MyVerySecurePassword" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ("mydomain\administrator", $secpasswd)
$guid = [guid]::NewGuid()
$database = "spconfig_$guid"
New-SPConfigurationDatabase -DatabaseName $database -DatabaseServer myservername\SharePoint -FarmCredentials $mycreds -Passphrase (ConvertTo-SecureString "MyVerySecurePassword" -AsPlainText -Force)
```

The PowerShell script was generating the error. The solution is simple - you need to enable IIS 6.0 Management Compat...
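The excerpt is cut off, but the fix it names is the IIS 6 Management Compatibility feature. As a sketch (not from the original post), on Windows 10 it can be enabled from an elevated PowerShell prompt; the feature name below is the standard Windows optional-feature identifier:

```powershell
# Run elevated. -All also enables any parent features the component depends on.
Enable-WindowsOptionalFeature -Online -FeatureName IIS-IIS6ManagementCompatibility -All
```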

C#: How do I call an API from a console application?

I have used the following code to call an API and pass in some parameters in the request header. Have a read here about `Encoding.GetEncoding(1252)`.

```csharp
private static string ExecuteAPI(string url, Dictionary<string, string> parameters, string method = "POST", string body = " ", int timeOut = 180000)
{
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = method;
    foreach (var param in parameters)
    {
        request.Headers.Add(param.Key, param.Value);
    }
    if ((method == "POST") || (method == "PUT"))
    {
        if (!string.IsNullOrEmpty(body))
        {
            var requestBody = Encoding.UTF8.GetBytes(body);
            request.ContentLength = requestBody.Length;
            request.ContentT...
```
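A minimal call site for the method above might look like this; the URL, header names, and token are placeholders for illustration, not values from the original post:

```csharp
// Placeholder headers - substitute whatever your API expects.
var headers = new Dictionary<string, string>
{
    { "Authorization", "Bearer <token>" },
    { "x-api-key", "<key>" }
};

// POST a small JSON body; for GET the body parameter can be left empty.
string response = ExecuteAPI(
    "https://example.com/api/items",
    headers,
    method: "POST",
    body: "{\"name\":\"test\"}");
```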

Azure: Why is my blob 0KB when I move it?

I was archiving documents in Blob Storage and found that the target files were always being rendered as 0 KB. I was using the following code:

```csharp
var storageAccount = CloudStorageAccount.Parse("CONNECTIONSTRING");
var blobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference("container");
var blob = blobContainer.GetBlobReference("myfilename");

// Get the source as a stream
Stream stream = new MemoryStream();
blob.DownloadToStream(stream);

var destBlob = blobContainer.GetBlockBlobReference(blob.Name);
destBlob.UploadFromStream(stream);
blob.Delete();
```

The code should work, except for the fact that the stream needs to be rewound to the start for the upload to work - after `DownloadToStream`, the stream's position is at the end, so the upload reads nothing:

```csharp
blob.DownloadToStream(stream); // (from above)
stream.Position = 0;
```

And now the copy/move works.
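As an aside, the classic storage client can also copy server-side, which avoids the download/upload round trip and the stream-position pitfall entirely. A sketch, assuming the same container and the `WindowsAzure.Storage` SDK (this is an alternative approach, not the original post's method):

```csharp
var destBlob = blobContainer.GetBlockBlobReference(blob.Name);
destBlob.StartCopy(blob.Uri); // asynchronous server-side copy

// Poll the copy state before removing the source.
destBlob.FetchAttributes();
if (destBlob.CopyState.Status == CopyStatus.Success)
{
    blob.Delete();
}
```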

Load data into a SQL Database using SqlBulkCopy, Entity Framework and reflection

I recently had the requirement to bulk load data into a SQL database. Easy enough, but since I was using the Entity Framework, most of the work has already been done for me. The plan is as follows: use SqlBulkCopy with a DataTable as an input. In order to do this, I will need to create some headers and then add the required data as rows before blasting them into the SQL Database.

```csharp
ConcurrentQueue<MyObject> items = GetMyItems(); // method not included
Type t = (new MyObject()).GetType();
var dt = SetHeader(t);
AddItems(dt, items);
Load(dt, "MyTable");

// Add the properties as columns to the datatable
private DataTable SetHeader(Type t)
{
    var dt = new DataTable();
    PropertyInfo[] pi = t.GetProperties();
    foreach (var p in pi)
    {
        if (string.Compare(p.PropertyType.Name, typeof(Guid).Name, true) == 0)
        {
            dt.Columns.Add(p.Name, typeof(Guid));
        }
        else
        {
            dt.Columns.Add(p.Name);
        }
    }
    return dt;
}

// add each row as a CSV to the D...
```
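The `Load` method is cut off in the excerpt above. A minimal sketch of what it might look like using `SqlBulkCopy` (the connection string is a placeholder, and the column mapping is an assumption that the DataTable column names match the target table):

```csharp
// Bulk-insert the DataTable into the named table.
private static void Load(DataTable dt, string tableName)
{
    using (var bulkCopy = new SqlBulkCopy("CONNECTIONSTRING"))
    {
        bulkCopy.DestinationTableName = tableName;

        // Map columns by name so ordinal differences between the
        // DataTable and the target table don't matter.
        foreach (DataColumn col in dt.Columns)
        {
            bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
        }

        bulkCopy.WriteToServer(dt);
    }
}
```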