A Lap Around Windows Azure

The first session that talked about Azure was (not surprisingly) incredibly popular. Filled a room that looked like it seated 700+ people. Then filled an overflow room. And the second overflow room is packed with people standing at the back.

The sample app is a thumbnail generator. In other words, a function that could normally be provided by a Windows Service. Interesting that the 'simplest' scenario is that one. Interesting, too, that it seems to indicate that what you're deploying into Azure is a service, with ASP.NET or Silverlight as the user interface.

A couple of times now, Azure has been referred to as an operating system.

Currently, the storage abstractions are: blobs for user data, simple tables for service state and queues for service communications. But these abstractions are not intended to replace a database. Astoria seems to be the expected CRUD channel.
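As a toy illustration of how the three abstractions divide the work, here is a sketch of the thumbnail sample using in-process stand-ins. To be clear, this is not the Azure API; the names, the dict/queue stand-ins, and the fake thumbnail step are all my own inventions.

```python
import queue

# Hypothetical in-process stand-ins for the three storage abstractions:
blobs = {}            # blob storage: raw user data (images, thumbnails)
status = {}           # simple table: per-item service state
work = queue.Queue()  # queue: communication between web and worker roles

def submit(name, image_bytes):
    # Web role: store the upload as a blob, record state, enqueue work.
    blobs[name] = image_bytes
    status[name] = "pending"
    work.put(name)

def worker_step():
    # Worker role: pull a message, read the source blob, write the result.
    name = work.get()
    thumb = blobs[name][:8]  # stand-in for real thumbnail generation
    blobs[name + "-thumb"] = thumb
    status[name] = "done"

submit("cat.png", b"0123456789abcdef")
worker_step()
```

The point of the split is that the web and worker roles never talk to each other directly; the queue decouples them, which is what lets each side scale independently.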

"Azure is an operating system" is getting to the point where it should become a drinking game.

The second demo is a live site, managing teachers in Ethiopia. The speaker actually asked that we not go to the site because it is live and the typical usage pattern doesn't include being hit by 1000+ rabid developers all at once. :) In what appears to be a common approach, the user interface is a Silverlight client communicating with Azure through web service calls.

As of noon (PDT) today, you can go to http://www.azure.com and download the desktop SDK. And publish applications to the cloud. Currently, usage is free, with restrictions. And there is no indication yet regarding the pricing model, although in the keynote, Ray Ozzie suggested that it would be 'competitive' and that there would be support for 'hobbyists'. So look to Amazon's EC2 for a rough idea and figure that there will be some low/no-cost option for you to play with.

Azure and On-Premises Data

One of the more interesting questions that will arise from cloud computing in general and Azure specifically is how the connection with on-premises data will be maintained. Based on what I saw in the keynote, they appear to have the link to Active Directory well in hand. In other words, identity can be federated so that the Azure service receives the credentials and passes the authentication request to an Active Directory server running on your corporate network. The demo in the keynote made that process look incredibly easy. Demo easy, with no discussion of the potential security risks or configuration requirements for the AD server. But that's beside the point (for the moment).

My real question relates to the access required for non-identity data. How will a company's internal data be made available to an Azure service? It appears that the answer is going to include SQL Data Services, but I'll be interested in hearing more about the details. Specifically, the level of exposure and integration required to allow an Azure application to retrieve and update information that is available only through an internal network. That has the potential to be an interesting challenge and sets up a confrontation between security and functionality.

Welcome to the new world, same as the old world.

Nothing like some new terms

Get ready to hear about the 'Fabric controller'. In a cloud computing environment, the fabric controller is the "O/S". It is responsible for managing the services that you have configured.

Modeling your services

The first step is to model your service. This means defining the attributes related to the deployment and execution of your service. This includes the channels and endpoints for your service (take a quick look at WCF for the definitions of these terms). As well, you define the security model by identifying the roles and groups. This information is persisted as the configuration for the service.
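The session didn't show the configuration format itself, so the following is only a hypothetical sketch of what a persisted service model might contain. Every element and attribute name here is an assumption of mine, not the actual Azure schema.

```xml
<!-- Hypothetical service model: names and schema are illustrative only -->
<ServiceModel name="ThumbnailService">
  <Endpoints>
    <!-- Where the service listens; compare WCF endpoints -->
    <Endpoint name="HttpIn" protocol="http" port="80" />
  </Endpoints>
  <Channels>
    <!-- How the roles communicate with each other -->
    <Channel name="WorkQueue" kind="queue" />
  </Channels>
  <Security>
    <!-- The roles and groups that define the security model -->
    <Role name="Admin" />
    <Group name="Operators" roles="Admin" />
  </Security>
</ServiceModel>
```

Whatever the real schema turns out to be, the key idea is that deployment and execution attributes live in declarative configuration rather than in code, which is what lets the fabric controller manage the service.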

Development Tools

Developing and testing the service can be done using familiar tools (i.e. VS2008). There is no need to deploy to the cloud in order to test. There is also no requirement that the application be written purely in managed code. This piece of information is a bit of a clue as to what is going on under the covers. In other words, there is probably Windows running someplace.

The development environment 'simulates' the cloud computing environment. Once the application is completed, it can be 'published' to Azure. The 'package' (the bin output for the project) and the configuration file are sent to Azure. After a few minutes, your application is running live on the cloud.

Simple. At least for the "Hello World" application. :)

Windows Azure Links

If you're looking for a set of links that provide more info on Windows Azure, check out http://blogs.msdn.com/cloud/archive/2008/10/27/bookmarks-windows-azure.aspx

Ray Ozzie Keynote at PDC 2008

Today's focus is on the back-end innovations. The client conversation/demos will be done at tomorrow's keynote. In other words, this is the 'cloud' talk.

The content of the first portion of his talk is about the convergence of IT Pro and IT Dev functions. Basically, he is making a case for the need for cloud computing. Things like redundancy, resilience, reliability, etc. Nothing exceptionally new here. But he then branches to the idea that, rather than having the infrastructure within a corporation, perhaps it would be better to have the infrastructure hosted by someone who specializes in offering web functionality that supports millions of developers. I think I can see where this is heading :)

The new service is called Windows Azure.

That explains the 'blue' theme seen throughout the Convention Center.

One of the goals is to be able to integrate the service with the existing toolset. "And you can", says Ray. But there will also be a new set of tools aimed at assisting developers with this 'cloud design point'. After all, it's not quite the same as traditional Windows applications. The focus for a typical Windows application is 'scale-up' and not 'scale-out'. And to work properly with the cloud, the 'scale-out' model is a requirement.

Keep in mind that one of the benefits of cloud computing is the ability to increase capacity by turning a dial that increases the number of 'servers'. For your application to work successfully in that environment, the manner in which you develop applications might change significantly. But the details on that piece will have to wait for tomorrow.
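As a toy sketch of the scale-out idea (my own illustration, not anything shown at the keynote): when a handler keeps no per-instance state, turning the dial just means raising the worker count, and the handler itself never has to change.

```python
from concurrent.futures import ThreadPoolExecutor

def handle(request):
    # Stateless: everything needed arrives with the request,
    # so any worker instance can serve it interchangeably.
    return request * 2

def serve(requests, workers):
    # "Turning the dial" is just changing `workers`; the results
    # are identical regardless of how many instances are running.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handle, requests))
```

The contrast with scale-up is that nothing in `handle` assumes local state (in-memory sessions, machine-local files), which is exactly the assumption that breaks when a traditional Windows application is spread across many servers.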

Now it's time for the demos. More shortly.

Traveling to PDC 2008

I'm writing this while at 35,000 feet, winging my way to PDC in Los Angeles. This is actually my first PDC so I'm looking forward to it. I've been to Tech Ed and Mix in the past, but have never made it to a PDC. How can anything bad happen while I'm surrounded by 7,000+ geeks?

There are a couple of areas where I expect to see some significant announcements. Some of them, such as the beta bits for Windows 7 and an early CTP for VS2010, are already widely anticipated. But there are likely to be more (and potentially more interesting) announcements from across the company.

For example, I expect to hear about some big initiative surrounding cloud computing. Aside from tools that will help developers take advantage of the technology, it wouldn't surprise me to hear about a service that competes with Amazon's EC2 service.

Another potential target for news is the Oslo project. There has been a bit of buzz (oooooo...alliteration) on this upgrade to Microsoft's service layer offering, but it will be interesting to see how Oslo is positioned, especially in relation to any cloud computing announcements.

Beyond the above, I'm sure there will be more surprises. From the number of sessions on the agenda, I expect that there will be some VSTS innovations. And my interest has been piqued by an MSR project called Pex that deals with automated white-box testing. I'll be live-blogging from the keynotes and the sessions that I attend, basically commenting on anything that catches my ear. So stay tuned for some PDC details...at least, my take on them.

A Disappointment with PPTPlex

I did a presentation this afternoon on some of the basic functions of WCF. I had put a slide deck together using a new Microsoft Office add-in called PPTPlex. You can see demos of what this add-in does in the provided link, but basically it allows for a much more dynamic experience of going through the slides in a slide deck. As compared to the typical linear flow, PPTPlex allows you to easily jump from one slide to another with a couple of clicks up and down the slide hierarchy.

I was pretty excited to be able to put this technology to work in the real world - right up to the time when I started the slide show.

Most of the time when I do presentations, I use a split screen. That is to say that what is displayed to the audience is not the same as what I see on the laptop in front of me. The new Presentation Mode in PowerPoint 2007 helps me a great deal with working that way. I was expecting that, when PPTPlex was run as a slide show, I would get the same appearance; that is, the slide show (such as it is) would be displayed on the secondary monitor.

I was disappointed.

Now it may be that there is a setting that I missed that would have allowed this to happen. I will admit that I was standing in front of the class when I tried this, so the time allocated for exploration was limited. But I expected it to just work and it didn't. Sigh.

Now I haven't yet got back to a place where I can do some detailed investigation, but as soon as I do I will see if I missed something. I hope so, but I doubt it. More details as they become available.

Mind Mapping from Word 2003 to 2007

I recall my first experience with Word 2007...it took me 20 minutes to figure out how to do a Save As. Who would have figured that the cute icon in the top left corner actually had functionality associated with it? :)

In a blog post from Chris Sells, I was pointed to a tool whose purpose is to bridge this gap. You see an on-line image of the Office 2003 menu structure and, by hovering over an item, you are told where to find the same function in 2007. You can find the tool here. Check it out, use it and avoid the embarrassment that I went through.

Writing is done, so back to the blogging

The two or three of you who follow my blog with regularity will have noticed that I was dark for most of the summer. The reason was that I was in the process of writing a book. Co-writing, would be more accurate, but still long hours were spent pounding out prose on my antique Underwood. Okay, maybe not so much pounding, but writing a book does dry me out for writing blog posts.

The recent influx of posts would seem to indicate that the book writing process was finished. And indeed it is. In fact, my editor informed me yesterday that the files have been shipped off to the publisher for final processing and printing. This is a source of great cheer, as I can now rest easy that no additional requests for editing will arrive in my inbox.

For those of you who are interested, the book is the MS Press training kit for the Windows Communication Foundation exam. You can see what it looks like at Amazon. And feel free to buy multiple copies...they make great Christmas gifts :)

Visual Studio and SQL Server 2008 Conflicts

I'm just passing along some information that has been making the rounds (I found it on Guy Burstein's blog).

If you attempt to install SQL Server 2008 on a machine that has Visual Studio 2008 installed, it will fail. The requirement is to have VS 2008 SP1 installed, an update that is still about a week away from release. And you need the 'real' SP1. The same problem exists with the beta for SP1.

So, for you developer/database people out there, it looks like at least a week of waiting to get the combination on a single machine.