Creating and Configuring Certificates for Azure PowerShell Management

If you’ve worked at all with Microsoft (previously Windows) Azure, you’ll know that the Management Portal can only get you so far. Eventually, you’re going to run into the desire to script tasks, or the need to use PowerShell because the functionality doesn’t yet exist in the Management Portal.

Once you open the Azure PowerShell command prompt, you’ll first have to configure your session to connect to the Azure subscription you want to manage. After you’ve done that more than once, you’ll be looking for ways to automate your subscription selection. Fortunately, there is an easy way to do that.

In this post, I’ll show you how to create a certificate, associate it with your subscription, and configure your machine using PowerShell to use that subscription by default. The steps I show here are designed to work in a development environment. Please be sure to review them to ensure that the security settings are compliant with your environment if you wish to do this process in production.

This post will cover, in brief, the following topics:

  • Creating a personal certificate
  • Exporting and uploading the certificate to Azure
  • Configuring Azure PowerShell environment on local machine

Creating a Personal Certificate

The easiest way to create a personal certificate is to use the makecert.exe utility which is installed with Visual Studio. Follow the steps below to create a certificate that is compatible with Windows Azure.

1.) Open a Visual Studio Command Prompt in Administrator Mode.

2.) Enter the following command. Note that Azure-compatible certificates must have a key length of at least 2048 bits and should be stored in the Personal certificate store. Replace the <CertificateName> placeholder with the name of the desired certificate. For more information on creating and managing certificates compatible with Windows Azure, read this MSDN article (http://msdn.microsoft.com/en-us/library/azure/gg551722.aspx).

makecert -sky exchange -r -n "CN=<CertificateName>" -pe -a sha1 -len 2048 -ss My "<CertificateName>.cer"

3.) Upon receiving a successful response from the command, open the "Manage User Certificates" console.

4.) Browse to the Personal store and verify that the certificate you just created exists.

Exporting and Uploading the Certificate to Azure

Once you have the certificate created, you’ll need to export a copy of it in .CER format to upload to Azure. By doing this, you’re allowing Azure to verify that a connection coming from your machine is valid and should be trusted.

The certificate that you export shouldn’t contain your private key, so be careful. If you export a copy of your certificate with the private key and someone were to get hold of it, they could pretend to be YOU in Azure’s eyes, opening up the possibility of a security breach.
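
If you’d rather script the export than click through the wizard in the steps below, something like this should produce the same public-key-only .CER file. This is a minimal sketch, assuming the certificate is sitting in your Personal store and that the Export-Certificate cmdlet (Windows 8 / Server 2012 or later) is available; the output path is just an example.

# Grab the certificate created above from the Personal store (adjust the subject name to match yours)
$cert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=<CertificateName>" }

# Export-Certificate writes only the public portion, so no private key leaves the machine
Export-Certificate -Cert $cert -FilePath "C:\Temp\<CertificateName>.cer"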

1.) From the Certificate Store console, right-click the certificate created in the procedure above and select All Tasks –> Export.

2.) At the first step of the Certificate Export Wizard, read the information presented and click Next. The next step will give you the choice to export the private key. Be sure to select No, do not export the private key.

3.) On the Export File Format screen, choose one of the options that results in an exported certificate in the .CER format. I chose the first option for my exported certificate.

4.) Provide a filename for the exported certificate and click Next. Review the options and click Finish to complete the export of your certificate.

5.) To upload the certificate to your Azure subscription, you’ll first need to log in to the Azure Management Portal. Once successfully logged in, open the Subscriptions menu and select Manage Subscriptions/Directory.

6.) On the Subscription Settings page, navigate to the Management Certificates section and click the Upload button on the bottom menu.

7.) Browse to the exported certificate created in the steps above and click the checkmark to upload the certificate.

Configuring Azure PowerShell Environment on Local Machine

With the personal certificate created and uploaded to your Windows Azure subscription, the final step is to add this subscription to your PowerShell environment and configure options for its default use.

To complete the following steps you’ll need to have the Windows Azure PowerShell cmdlets installed. You can choose to use either the Windows Azure PowerShell command line or the PowerShell ISE for this next task.

1.) Before you begin entering PowerShell code, you’ll need to collect two pieces of information. First, the certificate thumbprint. This is found by opening the certificate created earlier, browsing to the Details tab, and looking for the Thumbprint property. Copy all of the characters from the thumbprint and remove the spaces. Set this aside for a minute.
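
If you’d rather not click through the certificate dialog, a quick sketch like the following lists the certificates in your Personal store along with their thumbprints (the Thumbprint property already has no spaces to strip).

Get-ChildItem Cert:\CurrentUser\My | Select-Object Thumbprint, Subject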

2.) The second piece of information you need to collect is the Windows Azure Subscription ID. This is found in the Azure Management Portal under Subscription Settings – the same page you used above to upload the certificate. Copy the Subscription ID and set it aside for a minute.

3.) Back in your PowerShell window, first create a variable to hold your certificate, then execute an Azure cmdlet to add the subscription to your local configuration. The following two lines of PowerShell code take care of this.

Note that the path to the certificate assumes that you’ve created the certificate the same way as above, so it’s stored in your Personal certificate store. Modify the code below if you have it stored in a different location.

$cert = Get-Item "Cert:\CurrentUser\My\<CertificateThumbprint>"

Set-AzureSubscription -SubscriptionName "<EnterSubscriptionName>" `
    -SubscriptionId "<SubscriptionID>" -Certificate $cert

4.) To verify the subscription was added correctly, run the following command. This command will return a list of all subscriptions configured on your machine.

Get-AzureSubscription

5.) With the subscription now added, you need to select it and optionally set it as default. If set as default, it will be active upon opening a new Azure PowerShell window – a great option if you want to be able to open a command prompt and begin administering Azure right away. To do this, use the following command.

Note: you can add the -Default switch to make this the default subscription, as described above.

Select-AzureSubscription -SubscriptionName "<EnterSubscriptionName>"
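
For example, to select the subscription and mark it as the default in one shot (a sketch using the -Default switch mentioned above):

Select-AzureSubscription -SubscriptionName "<EnterSubscriptionName>" -Default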

6.) Now that the new subscription is added and selected, it’s time to test. You can run a command like one of the following to verify that your new certificate-enabled subscription is working properly.

Get-AzureVM

or

Get-AzureAccount

Conclusion

With the certificate installed and your PowerShell environment configured, it will be easier than ever to administer your Azure environment. You can even upload multiple personal certificates to your Azure subscription, so if you have multiple machines you can configure them all for easy access to your cloud environment.

PASS Business Analytics Conference – Keynote Day 1 Recap

[Photo: PASS keynote panorama]

The PASS Business Analytics conference has officially begun this morning. Yesterday wrapped with a successful round of pre-conference seminars and the Welcome Reception – the chorizo was pretty amazing, #thankslarock.

I was honored to be asked to sit at a special reserved table for bloggers and tweeters to broadcast keynote announcements live to the cloud! I was joined by many fellow bloggers and friends, including fellow BlueGranite coworker Melissa Coates (b | @sqlchick).

Today starts off with a keynote presentation featuring PASS president Thomas LaRock as well as Microsoft superstars Amir Netz and Kamal Hathi.

Thomas is the first to take the stage and kick off the second PASS Business Analytics Conference. He welcomes us and urges us to take hold of our passion for data and use it to give back to the community. PASS is one of the avenues we can use to give back: join a local user group, join a virtual user group, or start one if none exists near you. Get involved. PASS has grown to over 100,000 members and operates largely on volunteers’ hard work and effort.

John Whitaker from Dell, one of the Platinum Partner sponsors of the PASS Business Analytics Conference this year, joined the stage to give us some insights into Big Data and how it’s penetrating the mid-market tier. Big takeaways from this session are:

1.) Big Data projects are not just for the enterprise

2.) One of the primary factors in a successful implementation is IT-to-business alignment

3.) The most successful implementations so far have been Real-Time Analytics and Predictive Analytics

Overall John’s message is great and it’s an important one to hear. It’s also one we’ve heard before. Business has several challenges with data analytics: data complexity, data volume, and budget. While these challenges aren’t new, Big Data tools (like Hadoop) can help address these complexities in new ways that weren’t possible with traditional data analytics tools (like SSAS).

Collaboration between business and IT is also a key to success, which again, is nothing new, but as both tools and data become more complex that collaboration becomes ever more important.

Amir Netz is a Microsoft Technical Fellow and one of those great, engaging speakers who really expresses his enthusiasm for Excel and analytics through his presentation.

Kamal Hathi is a director of engineering for Microsoft BI. Kamal is in charge of the project teams that have created great new tools like Power BI.

Amir and Kamal really took charge of the room and drove up the excitement by announcing some AMAZING new features coming to Power BI. Seriously, there was a LOT of information during this keynote. Here is a brief recap:

The product updates came fast and furious during the demo:

  •   Updates to natural language queries to make it even easier for users to get access to data
  • The Field list is now shown by default in Power Q&A
  • SSRS will be available in Power BI this summer
  • Data will be able to be hosted on premise, but reports surfaced in the cloud. Direct connection possible, no data refresh necessary
  • Power BI iOS application will be available this summer
  • Interactivity in browser of hosted Power View reports. Users will be able to interact and edit hosted reports without downloading to local machine first
  • New chart types available INCLUDING TREE MAPS!!!
  • Natural language dashboard creation built into Power BI and Power Query
  • Forecasting algorithms built into Power View allow for easy forecasting of current data
  • New chart modification features that include the ability to combine charts, filter specific values out of one chart, move chart features from one to another.

As you can see, there was a massive number of updates announced today. I’m sure I missed one or two, so be sure to watch the Power BI site for updates, as well as the #passbac hashtag for live conference updates.

The keynote at today’s BA Conference opening included some great new information about PASS, the Business Analytics community, and great new features available within the Microsoft BI ecosystem.

I’ll be back tomorrow live blogging/tweeting the keynote presented by David McCandless – data visualization expert. Watch this blog and @joshuafennessy on twitter for updates.

PASS the BACon: The SQL PASS Business Analytics Conference is almost here!

It’s that time again: the 2nd PASS Business Analytics Conference starts this week, May 7th through May 9th.

This year, it’s being hosted in San Jose, CA, the center of leading-edge technology. This will be my first trip to Silicon Valley and I’m extremely excited to go. This conference holds a special place in my heart – while I love SQL Server and PASS Summit, the Business Analytics Conference really speaks to ME as a data professional. My background is in analytics, reporting, and enabling data insights – so this event is right up my alley.

In addition to attending this great conference, I’m also honored to be invited again as a speaker this year. I will be involved with two sessions this year:

· Has Big Data Killed the EDW? – I will be joining a panel of my prestigious peers in this session hosted by Stacia Misner (b | @StaciaMisner). We’ll be discussing the current state of the ‘data lake’ and helping to dispel some myths and rumors about what Big Data can do, and what it means for traditional data solutions.

· A Master Data Management Case Study: MDS and DQS – in this session I will introduce some MDM topics. We’ll talk about why MDM is important and a few different options to help with implementation. We’ll take a brief look at SQL Server’s MDM tools, MDS and DQS, and we’ll wrap up with a review of real world implementations. Attendees will get to see some details of how the solution was implemented, why certain decisions were made, and what lessons were learned.

I will also be Live Blogging/Tweeting both the Day 1 keynote with Amir Netz and Kamal Hathi as well as the Day 2 Keynote with David McCandless. I’m really looking forward to both of these keynotes. Amir always does a great job, and I can’t wait to see what he has in store for us this year.

David McCandless is one of those rare individuals who has mastered both the analytic side and the artistic side of data visualization. I’m really excited to hear his talk. I’ve been a fan of his visualizations for years and getting to see him present in person is a great honor.

I’ll be joined by fellow BlueGranite coworkers at the Business Analytics Conference.

Jason Thomas (b | @SQLJason) – DataViz You Thought You Couldn’t Do with SSRS

Javier Guillen (b | @javiguillen) – Business Insight through Cloud-Based Data Models

There is a great lineup of sessions in store for this week. I can’t wait for it to begin. See you there!


Breaking into BI: Where to begin?

April 15, 2014

This morning, I received some interesting news. A good friend of mine, who has been in IT for years, mostly on the operations and technical documentation side, signed up for the first test along the Microsoft BI MCSE path. You can probably never imagine how proud I was. It also got me thinking: where do you begin nowadays?

8(ish) years ago, when I was just learning the Microsoft BI stack, the starting point was pretty well defined: start with learning Kimball-style star schemas, then ETL with SSIS, OLAP with SSAS, and finally reporting with SSRS, Excel, and maybe PerformancePoint M&A.

Now, where to begin? There are so many more options now that getting ramped up makes my head spin. Power BI, cloud vs. on-premises, Big Data, in-memory. There are so many more ways to “do BI” in the Microsoft world now than there were even just 10 years ago that it’s staggering.

I think, for someone just starting out, building that understanding of the Kimball methodology is still really important. As much as we want to believe that PowerPivot and free-form designed models are going to rule the BI world, the fact of the matter is that the EDW is here to stay; it’s not dead.

What’s changed, however, is that someone starting out probably doesn’t need IMMERSION in the EDW world to break into analytics like before. Knowing the difference between a fact table and a dimension table will be a pretty good start to being able to develop a beginning Power Pivot model, and Power BI simplifies the model development, deployment, and reporting path even further.

That being said, I can’t imagine being able to really get a handle on the BI world without understanding star-schema design, so I think at some point within the first few months of training, immersion in star-schema design WILL be necessary.

So where to begin? I’ve been thinking about this for only a few hours at this point, and I want to open this post up to discussion. Here’s my thought of a plan to begin learning:

• Introduction to Analytics
  o Intro to Kimball-based star schemas
  o Basic data modeling with Power Pivot
  o Analytic reporting with Excel and Power View
• Intermediate Analytics and Reporting
  o Designing a full Kimball data mart
  o Intro to data integration with SSIS
  o Modeling star schemas with SSAS Tabular
  o Deployment of analytics through Power BI and SharePoint 2013
• Expert Analytics
  o Advanced dimensional modeling topics (SCDs, inferred members, etc.)
  o Analytic models with SSAS Multidimensional
  o Designing an enterprise reporting environment (Excel, SSRS, PPS)

Of course, there are lots of other topics to consider in this list as well (some that maybe didn’t exist before): Big Data (Hadoop, HDInsight), mobile BI, self-service model creation, cloud deployment, data mining, predictive analytics, etc. I think those topics are very important; however, if the goal is to become a Microsoft BI expert, a foundation in the SQL Server BI stack is key before jumping over to more advanced techniques.

Right now is an amazing time to be involved with Microsoft BI – it can seem like a pretty daunting task to jump in and start learning from square one, but the rewards are pretty great.

2014 Upcoming Presentations and a Discount Code

February 5, 2014


2014 has come in like a tidal wave! We’re already in February, and I’m just barely recovered from the Holiday vacation.

I’ve been a busy bee here in Michigan, having presented at SQL Saturday Nashville on 1/18/2014 – Great job again, Nashville Team!! – and I’ve got another event lined up.

This weekend, on Feb 8th, 2014, I’ll be speaking at SQL Saturday #241 in Cleveland, OH! My topic is Visual Analytics with HDInsight and Power View. This has been an area that I’ve been interested in, and playing around with it has proven to be very exciting. If you are a SQL professional who is curious about Hadoop and Big Data, this session will show how it all works, how to get HDInsight up and running quickly, and how to get started with Hive to begin transferring your SQL knowledge to a new technology. Finally, we’ll complete the circle by showing how to incorporate it with SSAS and use Power View to visualize the data. It will be an exciting and knowledge-filled hour!

You can download my presentation materials ahead of time here. I hope to see you there!


Moving farther into the future, I’ll be presenting again this year at the PASS Business Analytics Conference in San Jose, CA! This time, I’ll be showcasing Master Data Services (MDS) and Data Quality Services (DQS). I’ll be giving an overview of how the technology works, how it’s set up, and then show an example of how we’ve used it in the past to help clients manage master data efficiently.

If you’re considering attending this event, I’ve got a special discount code for you! Using my code BASB5V, you’ll be given a $150 discount to the conference! This code can’t be used with any other discount offers, and rates rise as we get closer to the event, so register now and use this code if you are ready to go!

I’m looking forward to both of these events. If you’re attending, I’d love to get a chance to chat with you. Happy analyzing!


The Worst Bar Chart of 2014? I think not.

February 4, 2014

Recently Karen Lopez (b | t) offered up a candidate for “Worst Bar Chart of 2014”. It’s really bad, but I don’t think it’s the worst. Here’s why.

First, it’s not a bar chart. It’s a waterfall chart. Now, beyond that, there are lots of things ‘wrong’ with the implementation of it that I would like to address here.

Before I begin dissecting, a little explanation of a waterfall chart. At first glance, a waterfall chart can easily be mistaken for a bar chart. It has bars, and it is a chart, but that’s where the similarity ends. Typically a waterfall chart is used to tell a story – a story of how data moves from Data Point A to Data Point B.

[Chart: simple inventory waterfall example]

A classic example is inventory. I give you Exhibit A. OK, this is an extremely simple example I made with MSPaint – the graphics tool of all power users – but it should help to prove my point.

This chart – remember, it’s not a bar chart – tells the story of HOW July’s ending inventory total made it to August’s inventory total. Green bars are increases, red bars are decreases, with data labels for clarity. At a glance, you can see whether the store is receiving an equivalent amount of inventory to what it is selling, etc. Pretty neat visualization, I think.
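
If it helps to see the mechanics in something other than MSPaint, here is a minimal PowerShell sketch (with made-up numbers) of how each floating bar in a waterfall chart is positioned: every bar starts where the previous running total ended.

# Hypothetical inventory movements from July to August (made-up numbers)
$start = 500                                     # July ending inventory
$moves = @(
    @{ Label = 'Received';  Change =  120 },
    @{ Label = 'Sold';      Change = -150 },
    @{ Label = 'Shrinkage'; Change =  -10 }
)

# Each waterfall bar floats from the prior running total to the new one
$running = $start
foreach ($m in $moves) {
    $from = $running
    $running += $m.Change
    "{0,-9} bar runs from {1} to {2}" -f $m.Label, $from, $running
}
"August ending inventory: $running"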

 

That’s a waterfall chart in a nutshell, so let’s move on.

Looking at the example that Karen provided, I see a few things (or more) wrong with it. Here are the big points (the ones easily fixable with MSPaint).

[Chart: the original chart from Karen’s post]

 

First, the colors. The blue portion of the chart is supposed to show movement. It doesn’t. Which direction does blue go in? In one section it’s moving left; in another, right. It’s also a starting point, and it’s an ending point. Additionally, it’s used to show the deficit created by the 2018 supply vs. demand. So, I propose modification #1 – new colors.

 

[Chart: modified version with new colors]

In this new example, green signifies an increase, red a decrease. The ending point of the red ‘bar’ is how we arrive at the Projected 2018 Supply starting from the 2008 Employment (156 + 161 – 32 = 285).

Second, the data labels. In this example, the data labels are not related to the movement of the data. All of the labels are oriented toward the right terminator of the bar, but the right terminator is not always the ending position of the bar’s movement. Some quick rearranging of the data labels helps make it a bit clearer where each bar starts and where it ends.

[Chart: modified version with repositioned data labels]

Finally, the last ‘bar’ on the chart doesn’t follow the same format as the others. It’s not showing movement. It’s showing the portion of the total that is a deficit. Additionally, the data labels here DON’T EXPLAIN THE DEFICIT BUT RATHER THE TOTAL. This is against so many good rules of visualization that it hurts my head just thinking about it. To fix this final bar, we would want to modify the colors, change the data label to a scale (showing 475 as the total), and then highlight the deficit with the correct number in question as a data label.

There is additional confusion created by alternating the row label colors (why?), but again, that’s not easily solved with MSPaint, so just imagine how much better it looks when the labels are all one color.

Here’s how I would have visualized a waterfall chart using this data. What do you think? More or less clear than the original version?

In specific rebuttal to Karen’s post, do I think this is the ‘Worst Bar Chart of 2014’? No. Mostly because it’s not a bar chart. Still, it’s a pretty awful chart, and while I’ve shown a few ways it can be better, it took WAY too much explanation for it to be an effective visualization. If you have to explain your chart in words, then why visualize it at all?

To reiterate one of Karen’s most impactful closing points: “If your chart leaves viewers thinking ‘I’m not sure’ more than once, it’s not effective.” If you are in the business of publishing visualizations to the public, please show your charts to someone who doesn’t know the data as well as you do. Their response will be a good litmus test of whether your chart will be effective out in the wild.

Happy charting,

Josh

Categories: Theory, Visualization

SQL Saturday Kalamazoo – Pre-conference announcement

SQL Saturday Kalamazoo is less than a month away! We are busily preparing the event to ensure a fantastic day of learning and networking.

One part of the event we have been working on is securing the pre-conference lineup. We are now happy to announce two pre-conference options for SQL Saturday attendees. This is our first year holding pre-conferences, and we are very excited to bring in Allen White and Eddie Wuerch to spend a full day diving deep into their topics. Interested in attending? See below for session details and registration links.

Automate and Manage SQL Server with PowerShell with Allen White

This soup-to-nuts all day workshop will first introduce you to PowerShell, after which you’ll learn the basic SMO object model, how to manipulate data with PowerShell and how to use SMO to manage objects. We’ll then move on to creating Policy-Based Management policies, work with the Central Management Server, manage your system inventory and gather performance data with PowerShell. We’ll wrap up with a look at the new PowerShell cmdlets introduced for SQL Server 2012 and how you can use PowerShell to manage SQL Server 2012 in server environments including Windows Server Core. After this one day, you’ll be ready to go to work and able to use PowerShell to make you truly effective.

Register with Allen here

A Deep Dive Into Waits-Based Performance Tuning with Eddie Wuerch

Start with a simple proposition: a process is either working or waiting. You can tune the working part, but are you seeing the whole picture? There are many different resources on which your process could be waiting – a lock, memory, disk, CPU, and much more.  When a process must wait, SQL Server will log it. There are hundreds of different wait types, and they are a gold mine of data for finding and solving performance problems – and proving the changes worked. Eddie Wuerch takes his extensive experience as a speaker, trainer, mentor, and DBA in one of the largest and busiest SQL Server environments in the world and distills it into a collection of performance tuning topics for DBAs and developers tuning databases of all sizes. After attending this seminar, you will be able to gather wait stats and use them to zero in on performance issues affecting your databases. Stop guessing, start knowing!

Register with Eddie here

Please note that pre-conference registration IS separate from the SQL Saturday registration. Early bird pricing for Pre-conferences ends on 10/15/2013, so sign up now for the best pricing options! If you’re still looking to register for SQL Saturday to see Allen, Eddie, and 25 other great presenters, sign up today!
