GoDaddy, you broke my heart

For starters, you broke my website.

Recently, I have sporadically seen random characters being appended to my domain redirect URLs, so that when somebody types in this:

They end up here:

Instead of here:

Notice the six random characters appended to the first URL: UMheZ. Since my server does not have a path, page, or route that maps to these random characters, it does not render a page.

It seems like I am not the only one having this problem:

So why do I claim that GoDaddy broke my heart, and not just my website? Because of the absolute apathy, borderline antipathy, they have shown their customers. I would expect them to fix this. But I get it: the solution may be complicated, it may be caused by somebody else’s system, the Internet is a complex place.

But there is simply no excuse to say “not my fault” and consider the case closed. We are your customers. Tell us “sorry this is broken, we are working to fix it and will let you know once we do; in the meantime, adding a ? to the end of your redirect domain usually solves the problem; I have sent you a screencast of me doing that if you want to see exactly how it is done.” The fact that they are so eerily quiet on this thread brings me to a sad realization: the love has gone from our relationship, and it has been gone for years now. We are only together because it is too expensive for us to break up.
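Incidentally, the trailing ? trick works because anything appended after it lands in the query string instead of the path, so the server still sees a request for a page it knows how to render. A quick sketch with Python’s standard library (example.com stands in for my actual domain):

```python
from urllib.parse import urlparse

# Without the trailing "?", the appended junk changes the path the server must resolve:
print(urlparse("http://example.com/UMheZ").path)   # -> /UMheZ  (no such route, so no page)

# With a trailing "?" on the redirect target, the junk becomes a query string
# and the path stays "/", which the server can always render:
print(urlparse("http://example.com/?UMheZ").path)  # -> /
print(urlparse("http://example.com/?UMheZ").query) # -> UMheZ
```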

But I think it is time for me to get back in the game, and find a domain partner who really likes me, who brings me unexpected surprises that make me happy, and works hard to make it right when something bad happens. Yep, we are breaking up.

If you have had a positive experience with GoDaddy that can offset this experience, I would love to hear about it. If GoDaddy broke your heart too, tell me in the comments below.

Have a great day.



Fixed: Failed Publish to Production on Azure

For some reason, my deployments to my production service on Azure stopped working. It looked like some automated process had made a change to my deployment scripts, so I deleted them and started over. Then I ran into this annoying problem, which I had resolved before but had just deleted the solution for:

Error Code: ERROR_FILE_IN_USE
More Information: Web Deploy cannot modify the file ‘INV.dll’ on the destination because it is locked by an external process. In order to allow the publish operation to succeed, you may need to either restart your application to release the lock, or use the AppOffline rule handler for .Net applications on your next publish attempt. Learn more at:
Error count: 1.

This time, I am going to store my solution so that when this happens again, I will know exactly what to do. The solution is actually quite simple:

  • Go to your Visual Studio website (e.g.,
  • Navigate to the appropriate project > Release
  • Click on the ellipsis (…) next to the release definition (on the left side) that controls your failing deployment, and select Edit
  • Click on the Tasks tab, and select the environment that you are publishing to (probably Production)
  • Click on Deploy Azure App Service
  • On the settings page for Deploy Azure App Service, there is an option for “Additional Deployment Options.” You guessed it: click that!
  • Finally, check the “Take App Offline” checkbox and save your way back home.
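If you publish straight from Visual Studio rather than through a release definition, the same behavior can, to the best of my knowledge, be switched on in the publish profile itself (the profile path and name here are illustrative):

```xml
<!-- Properties/PublishProfiles/YourProfile.pubxml -->
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Tells Web Deploy to drop app_offline.htm during publish, releasing
         file locks like the ERROR_FILE_IN_USE on INV.dll above -->
    <EnableMSDeployAppOffline>true</EnableMSDeployAppOffline>
  </PropertyGroup>
</Project>
```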

Up and running with TensorFlow on Windows

I was having some problems installing TensorFlow the way described on the website.

I tried both pip and Anaconda, and failed.

The pip installation gave me this error:

Importing the multiarray numpy extension module failed. Most
likely you are trying to import a failed build of numpy.
If you’re working with a numpy git repo, try `git clean -xdf` (removes all
files not under version control). Otherwise reinstall numpy.

The Anaconda installation gave me this error message:

tensorflow-1.2.1-cp35-cp35m-win_amd64.whl is not a supported wheel on this platform

and later

ImportError: No module named ‘tensorflow’

I then tried installing with pip directly:

pip install tensorflow

And, when I ran it from an Anaconda command prompt, everything worked like a (sort of) charm. You can test it out by running:

activate tensorflow


and then running this in the python console:

>>> import tensorflow as tf
>>> hello = tf.constant('Hello, TensorFlow!')
>>> sess = tf.Session()
>>> print(sess.run(hello))

Not sure if one of the original 15 attempts set this one up for success, but if you are having a hard time getting TensorFlow installed on your Windows machine, this might be the trick for you as well.



New Product Development Process

There is no one-size-fits-all process for new product development, but I liked this one:

I saw it while reading CTL.SC1x Key Concepts, MITx MicroMasters in Supply Chain Management (link), and I believe that it is extracted from Cooper, Robert (2001), Winning at New Products.

I am hoping to do a write-up on how this conventional product development pipeline struggles to add value in an enterprise. The primary challenge is connecting an unpredictable, non-linear invention process with a corporate framework that demands predictability. More to follow soon.

Setting up an Azure Data Lake and Azure Data Factory using PowerShell

This script creates a resource group, a Data Lake Store, and a service principal with the permissions an Azure Data Factory (ODX) needs, then writes the connection details to a file. Update the variables at the top before running it.

#First, ensure that you have an Azure Data Lake Store that you want to use for ODX.
#Uncomment this block to list the Data Lake Stores already in your subscription:
#$resourceGroups = Get-AzureRmResourceGroup
#$azureDataLakeNames = ""
#foreach ($resourceGroup in $resourceGroups) {
#    $azureDataLake = Get-AzureRmDataLakeStoreAccount -ResourceGroupName $resourceGroup.ResourceGroupName
#    $azureDataLakeName = $azureDataLake.Name
#    if ($azureDataLakeName.Length -gt 0) {
#        $azureDataLakeNames += " " + $azureDataLake.Name + " (resource group: " + $resourceGroup.ResourceGroupName + " & location: " + $resourceGroup.Location + ")"
#    }
#}
#"-----------"
#"DataLakeNames: " + $azureDataLakeNames
#REQUIRED: you must enter a unique appname which will be used as the security principal
$appname = "inventose"
#OPTIONAL: change the password for the security principal password
$password = "Xyzpdq"
#run the above script, and replace DATALAKESTORENAME with the appropriate name/rg/location from your existing data lake store; or enter a new name to have a data lake created
$dataLakeStoreName = $appname
$odxResourceGroup = $appname
$dataLakeLocation = "Central US" #Central US, East US 2, North Europe
#recommended to use the same resource group as the data factory for simplicity, but you can use any resource group or enter a new name to create
$dataFactoryResourceGroup = $odxResourceGroup
#specify where you want your data factory - current options are East US, North Europe, West Central US, and West US
$dataFactoryLocation = "West US"
#create odxResourceGroup, if it does not exist
Get-AzureRmResourceGroup -Name $odxResourceGroup -ErrorVariable notPresent1 -ErrorAction 0
if ($notPresent1) {
    New-AzureRmResourceGroup -Location $dataLakeLocation -Name $odxResourceGroup
}
#create data lake, if it does not exist
Get-AzureRmDataLakeStoreAccount -Name $dataLakeStoreName -ErrorVariable notPresent2 -ErrorAction 0
if ($notPresent2) {
    New-AzureRmDataLakeStoreAccount -Location $dataLakeLocation -Name $dataLakeStoreName -ResourceGroupName $odxResourceGroup
}
$homepage = "" + $appname
#create security principal, if it does not exist
$app = New-AzureRmADApplication -DisplayName $appname -HomePage $homepage -IdentifierUris $homepage -Password $password
$app = Get-AzureRmADApplication -DisplayName $appname
$servicePrincipal = New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
Start-Sleep 10
New-AzureRmRoleAssignment -RoleDefinitionName "Contributor" -ObjectId $servicePrincipal.Id -ResourceGroupName $odxResourceGroup
New-AzureRmRoleAssignment -RoleDefinitionName "Data Factory Contributor" -ObjectId $servicePrincipal.Id -ResourceGroupName $odxResourceGroup
New-AzureRmRoleAssignment -RoleDefinitionName "Reader" -ObjectId $servicePrincipal.Id -ResourceGroupName $odxResourceGroup
#Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path / -AceType User -Id $app.ApplicationId -Permissions All
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path / -AceType User -Id $servicePrincipal.Id -Permissions All
Get-AzureRmDataLakeStoreItem -Account $dataLakeStoreName -Path /ODX -ErrorVariable notPresent3 -ErrorAction 0
if ($notPresent3) {
    New-AzureRmDataLakeStoreItem -Folder -AccountName $dataLakeStoreName -Path /ODX
}
Set-AzureRmDataLakeStoreItemAclEntry -AccountName $dataLakeStoreName -Path /ODX -AceType User -Id $servicePrincipal.Id -Permissions All
#Start-Sleep 60 #there seems to be a lag between when these permissions are added and when they are applied...trying 1 minute to start
$subscription = Get-AzureRmSubscription
$subscriptionId = ($subscription).Id
$tenantId = ($subscription).TenantId
#ensure there are permissions
#Get-AzureRmDataLakeStoreItemAclEntry -Account $dataLakeStoreName -Path /
#get information on datalake
$dataLake = Get-AzureRmDataLakeStoreAccount -Name $dataLakeStoreName
#here is a printout
$text1= "Azure Data Lake Name: " + $dataLakeStoreName + "`r`n" +
"Tenant ID: " + $tenantId + "`r`n" +
"Client ID: " + $app.ApplicationId + "`r`n" +
"Client Secret: " + $password + "`r`n" +
"Subscription ID: " + $subscriptionId + "`r`n" +
"Resource Group Name: " + $odxResourceGroup + "`r`n" +
"Data Lake URL: adl://" + $dataLake.Endpoint + "`r`n" +
"Location: " + $dataFactoryLocation
Out-File C:\Users\MattDyor\Desktop\DataLake.ps1 -InputObject $text1

This is the Azure PowerShell You Are Looking For

I was getting an error telling me to login to Azure even after I had just logged in:

Run Login-AzureRmAccount to login

There was something out of alignment between the versions of PowerShell and Azure I had installed. After installing this version of PowerShell, I was up and running in no time. I read a number of other articles that told me to do things like Update-Module, and that did not work for me, but your mileage may vary.
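Before reinstalling, it can help to see which versions you actually have. A quick check with standard cmdlets (the AzureRM module name is an assumption based on the cmdlets used elsewhere on this blog; adjust if you use a different module):

```powershell
#Check the PowerShell engine version
$PSVersionTable.PSVersion
#List the installed Azure module versions to spot mismatches
Get-Module -ListAvailable -Name AzureRM* | Select-Object Name, Version
```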

Good luck!



Deleting Multiple Items on Azure

Probably for good reason, there is no easy way to delete a bunch of items from the Azure portal. This means drilling into each of the different items, clicking on delete, possibly confirming something by typing its name into a confirmation screen, waiting a minute for the operation to complete, and then advancing to the next. Totally acceptable for production assets, where you may seriously regret deleting the wrong database (and there is no undo!).

But, when you are developing on Azure, you end up creating a ton of assets, and deleting them one by one is slow (and you actually may go on auto-pilot and end up deleting something that DOES matter). Even worse, some assets do not have a delete option in the portal (hello ADF V2…still preview, so I get it).

The trick to deleting a bunch of items is to use a resource group. Start by navigating to one of the items that you want to delete, click “change” next to its resource group, and you will get the option to move a number of items into a new resource group called…trash or something like that. Once all of the items have finished moving to the new resource group, you can delete everything in one fell swoop by deleting the resource group. Pretty clever, huh? :)
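If you would rather script the same two steps (move, then delete the group), here is a sketch using the AzureRM cmdlets that appear elsewhere on this blog; the resource group names are placeholders, and note that not every resource type supports moving:

```powershell
#Create a throwaway resource group (names and location are placeholders)
New-AzureRmResourceGroup -Name "trash" -Location "West US"
#Move the resources you want to delete into it
$resources = Get-AzureRmResource -ResourceGroupName "my-dev-group"
Move-AzureRmResource -DestinationResourceGroupName "trash" -ResourceId $resources.ResourceId
#Then delete everything in one fell swoop
Remove-AzureRmResourceGroup -Name "trash" -Force
```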

Have any tips and tricks to share on the Azure portal? I would love to hear them.