The quickest, easiest way to make and connect to an Azure VM

# Prompts for the local admin credentials for the new VM
$creds = Get-Credential
$vmname = "TestVM"
# Simplified parameter set: resource group, network, public IP etc. get sensible defaults
$vm = New-AzureRmVm -Name $vmname -Credential $creds
$fqdn = $vm.FullyQualifiedDomainName

# Write a minimal .rdp file to the desktop, pointed at the new VM
$username = $creds.UserName
"full address:s:$fqdn" | Out-File "~\desktop\$vmname.rdp"
"username:s:$vmname\$username" | Out-File "~\desktop\$vmname.rdp" -Append

If you want a faster VM, add the -Size parameter, like -Size "Standard_DS3_v2"
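The AzureRm module has since been deprecated in favour of the Az module; assuming Az is installed and you're signed in, the same flow should be mostly a cmdlet rename (a sketch, not tested here):

```powershell
# Same idea with the Az module: New-AzVM replaces New-AzureRmVm
$creds = Get-Credential
$vmname = "TestVM"
$vm = New-AzVM -Name $vmname -Credential $creds
$fqdn = $vm.FullyQualifiedDomainName
```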
Recent posts

Recurring issues in Visual Studio projects

Common issues and solutions to errors I come across in Visual Studio on a semi-regular basis:

"Could not find a part of the path … bin\roslyn\csc.exe"
Run this in Package Manager Console:
Update-Package Microsoft.CodeDom.Providers.DotNetCompilerPlatform -r

"Creation of the virtual directory http://localhost:12345/ failed with the error: Cannot create the Web site"
Edit the .csproj file and, in the <VisualStudio> element, set the following:

"Could not load file or assembly [assembly name] or one of its dependencies. The located assembly's manifest definition does not match the assembly reference"
If it's a NuGet package, check the version of the assembly in packages.config, web.config and the "References" section in Solution Explorer; one of them is probably wrong. Try reinstal…
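The reinstall can also be done from Package Manager Console; the package name below is just a placeholder for whichever assembly is mismatched:

```powershell
# Forces NuGet to reinstall the package at its current version, repairing references
Update-Package Some.Package.Name -Reinstall
```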

Get the maximum length of every field in a CSV, Excel etc

While setting up tables in our data warehouse for use with some external APIs, it was getting a little tedious to figure out the max length of some long fields to get the right size for CREATE TABLE statements, so I wrote this little script to calculate it for me. Yeah, it's kinda messy, so sue me (or don't). You can swap out the Import-Csv part to also import from Excel or whatever else.

$csv = import-csv ~\Downloads\whatever.csv
$fields = $csv | select -first 1 | get-member | where MemberType -eq 'NoteProperty'

foreach($field in $fields)
{
    $data = $csv | select -ExpandProperty $field.Name
    $lengths = $data | % {$_.Length}
    $max = ($lengths | measure -Maximum).Maximum
    $field | add-member -NotePropertyName MaxLength -NotePropertyValue $max
}

$fields | select Name, MaxLength | Format-Table -AutoSize
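To try it without a file handy, pipe an inline CSV through ConvertFrom-Csv; the same loop works unchanged (and if you have the ImportExcel module, Import-Excel slots into the $csv line the same way):

```powershell
# Same technique against an inline CSV instead of a file
$csv = @"
name,city
Alice,Copenhagen
Bob,Rio de Janeiro
"@ | ConvertFrom-Csv

$fields = $csv | select -First 1 | Get-Member | where MemberType -eq 'NoteProperty'
foreach($field in $fields)
{
    $max = ($csv | % { $_.($field.Name).Length } | measure -Maximum).Maximum
    $field | Add-Member -NotePropertyName MaxLength -NotePropertyValue $max
}
$fields | select Name, MaxLength | Format-Table -AutoSize
# city → 14 ("Rio de Janeiro"), name → 5 ("Alice")
```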

Handling double-quoted CSVs in Azure Data Factory Pipelines

Azure Data Factory by default uses a backslash as the escape character for CSVs, but we ran into an issue with this today while processing a CSV file from one of our sources. As with most CSVs they use quotes around values as normal, with double-quotes for empty values, but they also use a doubled quote to escape quotes within non-empty values. This probably sounds confusing, so here's an example:

"column 1","column 2","","column 4 value is ""sort of"" like this"

The ADF pipeline failed because the double-quotes don't get escaped correctly:
ErrorCode=UserErrorSourceDataContainsMoreColumnsThanDefined, found more columns than expected column count.

The solution was to change the "Escape character" property on the dataset, by clicking the "Edit" button beneath it and manually entering "", since "" isn't in the list of escape characters.  I didn't think this would work at fir…
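For reference, the equivalent setting in the dataset's JSON looks something like this (a sketch of a DelimitedText dataset's typeProperties, with the surrounding properties omitted):

```json
{
    "type": "DelimitedText",
    "typeProperties": {
        "columnDelimiter": ",",
        "quoteChar": "\"",
        "escapeChar": "\""
    }
}
```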

Handy SOQL query snippets

To count how many people logged in for the last time each year, for only inactive users:

select calendar_year(lastlogindate) LoginYear,count(id) Total
from user
where isactive = false
and lastlogindate != null
group by calendar_year(lastlogindate)
order by count(id) desc

Formats for dealing with dates in Salesforce

The correct way to format dates for bulk importing CSVs into Salesforce is year-month-day, so today would look like 2019-06-17.  These default to the timezone set for the user importing them, so there's no need to convert to UTC.  Set this in Excel by:

1. Select the column/s containing dates
2. Right click on them and select "Format Cells" from the menu
3. Select the format "2012-03-14"
4. Save as CSV (not the UTF-8 type of CSV)

For programmatic imports, the correct .NET Framework DateTime format string to use for a Salesforce Date/time is:

Make sure it's in UTC! If the time represents the current moment, use DateTime.UtcNow

You might be tempted to use "O" (round-trip pattern) since it looks sort of similar, but don't.  It won't cause an error, but the time will be parsed incorrectly.
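In PowerShell (which uses .NET format strings under the hood), a sketch assuming the usual ISO-8601 pattern with milliseconds and a literal Z suffix:

```powershell
# Pattern shown for illustration; the unquoted Z is emitted as a literal character
$utc = [DateTime]::UtcNow
$utc.ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
# e.g. 2019-06-17T03:25:41.123Z
```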

For querying with timezones, use a format like: 2019-08-01T00:00:00.000+10:00