Thursday, November 27, 2008

Will the real slim shady please stand up? ... (VMWare Lab Manager vs Microsoft Team Lab)

I've been nagging myself for a while about why the offerings of Team Lab felt so familiar, and I just remembered why.

Sometime around spring 2007 I had a peek at a product called "VMWare Lab Manager" to see if we could benefit from it when virtualizing our test lab environments. We never ended up going down that road, but that had more to do with the fact that we didn't have the energy to introduce yet another product at the time.

Guess what though: the products are extremely similar. Just take a look at one of the key goals of Lab Manager:

Capture and Reproduce Software Defects—Every Time
Enable developers and testers to quickly reproduce software defects and resolve them earlier in the software lifecycle—and ensure higher quality software and systems. VMware Lab Manager enables “closed loop” defect reporting and resolution through its unique ability to “snapshot” complex multi-machine configurations in an error state, capture them to the library, and make them available for sharing—and troubleshooting—across development and test teams.

I do believe that Microsoft has been heavily influenced by Lab Manager when designing their Team Lab SKU of Team System; as you can see in the architectural overview of Lab Manager below, you get pretty much the same offerings in both products.

One thing that bugs me though is the fact that neither Microsoft nor VMWare has put the effort into integrating Lab Manager with TFS, which has been done with other ALM suites:
Integrate with Leading Test Management Tools
Enable users to access VMware Lab Manager seamlessly from within their preferred test management tools. Off-the-shelf integrations with Borland SilkCentral Test Manager and HP Quality Center allow users simply to select the desired multi-tier configuration and Lab Manager will do the rest — automatically provision the test environment, tear down the environment after a test is run, and capture the state of the application, test data and virtual machine configuration in the event of a failed test.

Maybe this is something for the TFS community to produce? I'm interested if someone can provide an environment for testing, with licenses for the VMWare stuff, since I don't have access to them myself (if you are interested in a CodePlex project about this, let me know).

So why am I writing this post? Well, the idea of eliminating the dreadful no-repro or it-works-on-my-machine scenario appeals to me, so I wanted to make sure that people are aware that VMWare has an offering in this area as well. Also, the fact that you don't have to wait until VSTS 2010 hits the streets is a big plus.

Another good thing is that since Team Lab is going to work with both VMWare and Hyper-V virtualization, any effort you put into working with Lab Manager today will easily migrate to Team Lab in the future. There is one drawback with the Team Lab SKU as it looks today compared to Lab Manager: it is very integrated with the new test client (codenamed "Camano"), and as far as I have seen there is no web-based management for the Team Lab stuff (yet, at least). Personally I believe it should be split from the test management, since in my experience it is not the testers that provision the lab environments.

No more "Death-by-PowerPoint" ... Or how to improve your presentational techniques

A while back I did my first public presentation at a conference, and I just didn't feel I managed to pull it off as well as I had wanted to. So I started to search around for some material and ended up buying a bunch of books that I just finished reading.

Since I'm personally committed to not accidentally causing any more "death-by-powerpoint", I'll keep posting about my successes and failures whenever I feel I have something to contribute on the subject of presentation design and delivery.

Slide:ology by Nancy Duarte

This book is truly a must-have for all us non-designers that still want to make our presentations memorable for our audiences. Nancy Duarte (one of the people behind Al Gore's successful climate change presentation) has literally poured 20 years of knowledge into this book; the book itself is beautifully presented and is a joy to read.

It works splendidly as a reference book on your desk for whenever you need to be creative and put together your presentations. I believe her husband is very much right in this quote from the foreword of the book:

...slide:ology is destined to become the desk reference for building effective presentations and is a must read for all who present...

For more information, go to

Presentation Zen by Garr Reynolds

I must say that this book was truly a joy to read; I actually read it from cover to cover in one sitting (almost: I had to take care of the kids during the day, so there was a brief period of non-reading).

The book is all about a state of mind, in my opinion. The author does not present us with a method that we are to follow rigorously to be successful; rather, he gives us various pointers on how to create good presentations and become a better presenter.

Some key takeaways are:

Less is more: keep it simple.

Go analog: turn off the computer and start with pen and paper.

Design matters; it is not the icing on the cake, it's the foundation.

A picture says a thousand words.

Put yourself in your audience's shoes: why are they there?

If you are serious about becoming a better presenter you should get a copy of this book, since it will most definitely inspire you. For more information, go to

Beyond Bullet Points by Cliff Atkinson

This book is all about method: it gives you a straight recipe for creating presentations according to the BBP way, including a bunch of templates for getting started.

Although it contains a lot of really good ideas, which I am sure I will use the next time I create a presentation, the book itself is a rather boring read and uses way too many words to get to the point. I ended up skimming the book instead of reading it from cover to cover. This is kind of sad, since I believe the author could have conveyed his message in half the pages or less. It is still a book you should have read if you are working with slide-based presentations.

Some of the key ideas in the book have to do with:

Structure and how the brain processes information.

3 is a magic number.

Visual cues throughout the presentation are important.

Headlines and illustrations: keep the amount of information on the slides minimal.

I won't go into detail about the method, since I expect that the author would not like that, so for more information go to

Apart from reading the books presented in this post, you should start to hang around a couple of sites as well, for studying other presentation designs and for finding graphics for your presentations.

Friday, November 21, 2008

Test Lab Environment Automation In The Cloud (SkyTap Virtual Lab)

Yesterday, when I was digging into some more details around Team Lab (the newest SKU of Visual Studio Team System), I stumbled upon something that I sure wish I had had access to when we first started to virtualize our quality assurance lab environment.

A company called SkyTap announced a product called Virtual Lab back in April 2008 (read about it here). The product aims to provide a virtualized lab environment in the cloud; this has some real potential and I will surely look into this as a platform for our future lab environment.

What I find most attractive in a product like this comes down to three things:

1. It will ease the demand on the operations department when it comes to in-house expertise in virtualization technology (this is particularly needed for small and mid-size companies who simply can't afford to staff that type of competency).

2. The self-service provisioning model, where you add the resources you need as you go and simply pay for what you use. No more large capital expenditure requests; we can instead transfer the costs onto the running operational costs. This is also a very big deal for agile teams, in my opinion, since one of the problems we have is that the pressure on resources spikes on and off when running multiple agile teams where all the teams have a dire need for their own environment.

3. We get a library of baselined virtual images, which will save us tons of time compared to configuring and creating these ourselves.

Otherwise the product is very similar to VMWare Lab Manager: it allows us to create labs consisting of multiple machines. We can easily create snapshots of the environment whenever we find a bug and attach a link to that snapshot to our defect report, which the developer can later bring back to life to investigate the issue in the same environment as the tester. We also get access to a REST-based automation API for our lab environments, and much more.
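To make the snapshot-link-in-a-defect-report idea concrete, here is a minimal sketch of what attaching such a link could look like. The URL scheme, function names and field names are all invented for illustration; consult the vendor's actual REST API documentation for the real ones.

```python
# Hypothetical sketch: build a link to a captured lab snapshot and record it
# on a defect report, so a developer can later restore the exact
# multi-machine environment the tester saw. All names are invented.

def snapshot_url(base_url, lab_id, snapshot_id):
    """Build a link to a lab snapshot (hypothetical URL scheme)."""
    return f"{base_url}/labs/{lab_id}/snapshots/{snapshot_id}"

def attach_snapshot(defect, base_url, lab_id, snapshot_id):
    """Return a copy of the defect record with the snapshot link attached."""
    defect = dict(defect)  # don't mutate the caller's record
    defect["snapshot_link"] = snapshot_url(base_url, lab_id, snapshot_id)
    return defect

defect = {"id": 4711, "title": "Login fails on second node"}
defect = attach_snapshot(defect, "https://lab.example.com/api", "lab-42", "snap-7")
print(defect["snapshot_link"])
# https://lab.example.com/api/labs/lab-42/snapshots/snap-7
```

The point is simply that a snapshot reference is cheap to carry around in a work item, which is exactly what makes the closed-loop defect workflow possible.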

Another interesting tidbit is the announcement they made at PDC08 about their integration with TFS. Nothing fancy, but they have a custom control which we can embed in our work item type forms that shows the available snapshots, so we can get the link straight in the Visual Studio IDE and double-click it to get to it (if you want a peek at how it looks, watch this screencast).

Tuesday, November 18, 2008

Clueless about Azure? Grab the Azure Services Training Kit and give it a go!

Just stumbled across the Azure Services Training Kit, which looks really nice as a starting point for playing around with the Azure Platform:

The Azure Services Training Kit will include a comprehensive set of technical content including samples, demos, hands-on labs, and presentations that are designed to help you learn how to use the Azure Services Platform. This initial PDC Preview release includes the hands-on labs that were provided at the PDC 2008 conference. These labs cover the broad set of Azure Services including Windows Azure, .NET Services, SQL Services, and Live Services. Additional content will be included in future updates of this kit.

Download it and start playing...

SQL Services: Codename "Huron" - Sync Enabled Cloud Data Hub

As I mentioned in my previous post, the Microsoft Sync Framework and the SQL Services guys have teamed up for some cool projects for the cloud.

Codename "Huron", which is one of them, seems to be the answer to one of the initial questions I've been thinking about, namely the ability to maintain an application on-premise while using the Azure Platform to extend that application to handle peak loads, or simply slicing off parts to run in the cloud.

As you can see in the picture, "Huron" sits in the cloud acting like a master data hub, allowing us to easily build solutions that speak to the local sync providers that ship with the project. Currently they have built providers for Access and SQL Server Compact, but this will be extended to include SQL Server as well (as you can see in the quote below).

Leverage the power of SQL Data Services and Microsoft Sync Framework to enable organizations and individual workers to build business data hubs in the cloud allowing information to be easily shared with mobile users, business partners, remote offices and enterprise data sources all while taking advantage of new services in the cloud. This combination provides a bridge, allowing on-premises and off-premises applications to work together. Using “Huron”, enable sharing of relational stores like Microsoft Office Access, SQL Express, SQL Server Compact, and SQL Server, enable B2B data sharing, and push workgroup databases to field workers and mobile users.

The driving technology behind this project is the Microsoft Sync Framework, so if you're not entirely up to speed on that you could start by having a look at the following article, Introduction to the Microsoft Sync Framework Runtime.

The configuration of the synchronization process is highly flexible, since you can decide which tables you want to put in the cloud, and you will be able to auto-sync bi-directionally or put the synchronization on a schedule.
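The core idea of a bi-directional sync against a cloud hub can be sketched in a few lines: only the tables you opt in are synced, and on conflict the most recently changed copy of a row wins. "Huron" itself builds on the Microsoft Sync Framework; everything below is an invented, simplified illustration of the concept, not its actual API.

```python
# Toy bi-directional sync between a local store and a cloud hub.
# Each table is a {key: (version, value)} dict; higher version wins.

SYNC_TABLES = {"customers", "orders"}  # the tables we chose to put in the cloud

def sync_table(local, cloud):
    """Merge two stores bi-directionally, letting the last writer win."""
    for key in set(local) | set(cloud):
        l, c = local.get(key), cloud.get(key)
        if l is None or (c is not None and c[0] > l[0]):
            local[key] = c       # cloud copy is newer (or local never saw it)
        elif c is None or l[0] > c[0]:
            cloud[key] = l       # local copy is newer (or cloud never saw it)

def sync(local_db, cloud_db):
    for table in SYNC_TABLES & set(local_db) & set(cloud_db):
        sync_table(local_db[table], cloud_db[table])

local = {"customers": {"c1": (2, "Ada"), "c2": (1, "Bob")}}
cloud = {"customers": {"c1": (1, "Ada (old)"), "c3": (1, "Eve")}}
sync(local, cloud)
print(local["customers"] == cloud["customers"])  # True: both sides converged
```

A scheduled sync would simply run `sync` on a timer; the real framework adds change tracking, tombstones for deletes and pluggable conflict resolution on top of this basic shape.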

Sadly the download link is not yet available on the project homepage, but as soon as it is I will start to take it for a spin, which I would like to encourage you guys to do as well. This is an important building block for allowing a smooth transition path between on-premise and the cloud.

For more information about "Huron" and other interesting SQL Services incubation projects, be sure to keep tabs on SQL Services Labs.

Sunday, November 16, 2008

SQL Services: Codename "Anchorage" - SyncToy Moves To The Cloud

The Microsoft Sync Framework and the SQL Services guys have teamed up to produce some rather interesting incubation projects; the first one is codename "Anchorage":

We’re evolving the popular SyncToy application to enable much more than just file/folder synchronization between PCs! With this project, providers will be able to register and be discovered in a variety of sync groups including contacts, files, favorites, videos, as well as synchronization across services such as the Live Mesh,,, and more. Powered by the Microsoft Sync Framework - this E2E and hub for sync providers has value for both consumers AND developers...
This project aims to provide synchronization between services. It's a provider-based model that allows us to create so-called sync groups, to allow for greater flexibility when dealing with multiple sources and heterogeneous data. At first glance it looks like this should really be integrated with Live Mesh.
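To get a feel for the provider-based model, here is a rough sketch: providers register for one or more sync groups ("contacts", "files", ...), and synchronizing a group involves every provider registered for it. The names and structure below are invented for illustration, not taken from the actual product.

```python
# Toy provider registry for sync groups: each provider declares which
# groups it participates in, and a group sync touches all its providers.

from collections import defaultdict

registry = defaultdict(list)  # sync group -> list of registered providers

def register(provider, groups):
    """Register a provider for one or more sync groups."""
    for group in groups:
        registry[group].append(provider)

def sync_group(group):
    """Return the providers that take part when this group is synchronized."""
    return list(registry[group])

register("outlook", ["contacts"])
register("live-mesh", ["contacts", "files"])
register("folder-share", ["files"])
print(sync_group("contacts"))  # ['outlook', 'live-mesh']
```

The flexibility comes from the indirection: adding a new data source is just one more `register` call, and no existing provider has to know about it.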

Unfortunately the download link is still not public, so I guess we will have to wait and see what gives. I'll probably post some more on this topic once the bits are available.

For more information about "Anchorage" and other interesting SQL Services incubation projects, be sure to keep tabs on SQL Services Labs.

Friday, November 14, 2008

Missing my custom icons for my desktop IE links (...hint they are called favicons...)

Finally I managed to wrap my head around a small but really annoying thing that occurred when I upgraded to Vista on my laptop. After upgrading and restoring all my favourite website links on my desktop, all the icons reverted to the big blue E, which is the icon for IE. At first I thought it was related to the switch to Vista, but it seems it's an IE7-related issue, as you can read below.

Anyway, I'm sure there are still a few more poor suckers like me that haven't figured it out yet, so I thought I'd share the joy by posting about it...

The reason behind this behavior is the following (you can read more about it here):

Because the shell asks for 48x48 icons, but favicons are 16x16. Stretching them would have looked bad. This decision was made late in the IE7 cycle. Many people have complained and we are considering a fix for a future release.
The remedy is really simple, just do the following: right-click on your desktop, select the View menu, select classic icons, and off you go (the picture below shows the menus in question):

Unfortunately the classic icons are smaller (16x16) and don't look that nice, but I still prefer to have the customized icon of the site in question so I can find the link fast when looking at my desktop.

Thursday, November 13, 2008

Windows Azure an introduction and what will it mean to the corporate developer? (Part 4)

This will be the final installment in my series about the Azure Services Platform. This time I'm going to talk about some of the building blocks available (in my opinion the more interesting ones), namely .NET Services (formerly known as BizTalk Services) and SQL Services (formerly known as SQL Server Data Services).

I will not talk about Live Services, mainly because I haven't had the time to look at it in more detail, but I might get around to that later on, since there are some interesting aspects to some of the offerings in there even for a more corporate-centric application.

.NET Services - ServiceBus

This very cool piece of technology gives us a servicebus in the cloud, which may change the way many companies solve their integration scenarios in the future.

The servicebus is not about hosting your services in the cloud, but rather about making on-premise services publicly available in a really easy way. Everything is based upon WCF, so your previous investments in this technology really pay off; the only thing we have to do is change a binding, so exposing an already existing service through the servicebus is really a one-liner.

In the current CTP release the focus lies on non-durable communication. Fortunately, Microsoft talks about implementing both durable multicast and something called anycast (where the first available subscriber gets the message). Personally I think the lack of durable multicast limits the servicebus somewhat for B2B integration, since there are not many scenarios where a partner or customer will be satisfied to only get their messages if their apps happen to be up and running. So we really need durable multicast before usage will really pick up in this area.
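The difference is easy to see in a toy model: with volatile multicast an offline subscriber simply misses the message, while a durable subscription queues it for delivery when the subscriber comes back. This is an invented illustration of the semantics, not the actual service bus API.

```python
# Toy bus contrasting volatile and durable multicast semantics.

class Bus:
    def __init__(self):
        self.subs = {}  # name -> {"online": bool, "durable": bool, "inbox": []}

    def subscribe(self, name, durable=False):
        self.subs[name] = {"online": True, "durable": durable, "inbox": []}

    def multicast(self, message):
        for sub in self.subs.values():
            if sub["online"] or sub["durable"]:
                sub["inbox"].append(message)  # durable: held until reconnect

bus = Bus()
bus.subscribe("partner-a", durable=False)  # volatile subscription
bus.subscribe("partner-b", durable=True)   # durable subscription
bus.subs["partner-a"]["online"] = False    # both partners go offline
bus.subs["partner-b"]["online"] = False
bus.multicast("invoice-123")
print(bus.subs["partner-a"]["inbox"])  # [] -- the message is simply lost
print(bus.subs["partner-b"]["inbox"])  # ['invoice-123'] -- queued for later
```

In a B2B setting partner-a's empty inbox is exactly the problem: an invoice that silently disappears because the receiving app was down for maintenance.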

Apart from providing a volatile multicast mechanism, we can also easily expose an on-premise service through the bus even though we are behind a firewall (it even works with NAT). Another really cool thing is the magic they perform to establish a direct connection (you can configure this behaviour) even when the caller and callee are both behind a NAT.

.NET Services - Workflow Services

This is another service that has great potential. Apart from providing us with a scalable hosting environment, we should be able to construct really elegant solutions using a workflow to orchestrate a set of services and provide new functionality based on that. Or maybe we simply want to massage the data somewhat before sending one or more versions of a message onto the servicebus.

Unfortunately, in the CTP they only allow fully declarative workflows; that means no custom activities, which will limit the use somewhat. The subset of activities is rather small and contains the basic control flow stuff plus a bunch of HTTP and XML helper activities that are new in the Azure Platform.

We will be able to host WF 3.5 and beyond, and the deployment process is really a breeze: you simply right-click on your workflow in Visual Studio to deploy to the cloud. Once you have your bits deployed you can manipulate the workflow types and instances through a management portal. Unfortunately this portal is not suitable for large volumes of running instances, and we are left (at least as it looks now) to implement a better management client ourselves (luckily the management APIs are available to us).

.NET Services - Access Control

Provides us with a Security Token Service (STS) in the cloud that can provide federated security through integration with a wide variety of different sources such as Active Directory, Live ID, CardSpace and, in the future, OpenID as well.

It works with the Servicebus, Workflow Services and SQL Services, providing us with a consistent access control model throughout the breadth of the building block services provided in the Azure Platform.

There is a lot happening in this area with the "Geneva" framework and server offerings, which I haven't had the time to drill down into (honestly, security is a necessary evil :) ... nah, but it is not as fun as the workflow and servicebus stuff, so getting around to the details isn't at the top of the list yet).

SQL Services

Aims to provide us with a database in the cloud. Currently it is very similar to the storage offerings in Windows Azure, with the main difference being that SQL Services is built upon SQL Server (as the name implies). The way to get access to the data is through a REST API (or, if you like, ADO.NET Data Services).

The storage model available now is really a hierarchical model that looks like this: at the top we have an authority, which can contain one or many containers, which in turn consist of one or many entities that are collections of typed name/value pairs. Right now we are limited to the following scalar types: string, binary, boolean, decimal and dateTime.
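The authority/container/entity hierarchy can be sketched with plain dictionaries; the validation below mirrors the five scalar types listed above (dateTime mapped to `datetime`, binary to `bytes`). This is my own illustration of the shape of the model, not the actual API.

```python
# Toy model of the authority -> container -> entity hierarchy, where an
# entity is a flat collection of typed name/value pairs.

from datetime import datetime
from decimal import Decimal

SCALAR_TYPES = (str, bytes, bool, Decimal, datetime)  # the allowed scalars

def make_entity(**properties):
    """Build an entity, rejecting values outside the allowed scalar types."""
    for name, value in properties.items():
        if not isinstance(value, SCALAR_TYPES):
            raise TypeError(f"{name}: unsupported type {type(value).__name__}")
    return properties

authority = {                      # top level: the authority
    "orders-container": {          # containers hold entities
        "order-1": make_entity(customer="Ada",
                               total=Decimal("99.50"),
                               shipped=False),
    }
}
print(authority["orders-container"]["order-1"]["customer"])  # Ada
```

Notice what is missing compared to a relational database: no schemas, no joins, no secondary indexes. That is why the offering initially feels closer to the Azure storage system than to SQL Server proper.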

In the future we will get support for more SQL Server functionality, such as reporting, analysis and much more.

Tuesday, November 11, 2008

Windows Azure an introduction and what will it mean to the corporate developer? (Part 3)

This part of the series will talk in a bit more detail about Windows Azure. The pictures I use in this post come from the excellent white paper Introducing the Azure Services Platform, written by David Chappell.

So what is Windows Azure? Microsoft themselves define it like this:

Windows® Azure is a cloud services operating system that serves as the development, service hosting and service management environment for the Azure Services Platform. Windows Azure provides developers with on-demand compute and storage to host, scale, and manage Web applications on the Internet through Microsoft® data centers.
Or, simply put, it's an operating system for the cloud, where the cloud is basically a group of servers (typically a large number of them). Initially it will be available as a service running in Microsoft's datacenters, although there were some hints that this might be made available for others to run in their own datacenters as well.
As the picture indicates, the central piece in this is the part called the fabric controller, which handles the automated service management and makes sure that your applications are provisioned in the way you have specified.

These specifications are done using models that specify things such as topology information, health constraints, logical resources and so forth. As far as I could tell they were not done using the "M" language yet, but the intention is that eventually all this will migrate into a cohesive whole. These models are then used to handle automated deployment and monitoring of your applications.

As you can see in the picture above, there are two ways of deploying into Azure, namely the Web Role (used for the public endpoints of your applications) and the Worker Role (used for async work, normally triggered either by listening to the servicebus or by polling a queue in the storage system).

In the current CTP release you can only deploy ASP.NET applications and .NET code; in the release version of Azure, Microsoft intends to provide the possibility to deploy PHP-based applications and support for unmanaged code as well. It's worth noticing that the code is not running with full privileges but in a special sandbox mode, similar to what you get from today's application hosting environments.

Along with all these goodies we get access to a scalable file system as well; it's not to be confused with SQL Services, which intends to provide a database in the cloud. Much in the same fashion though, the access APIs are REST interfaces and are accessible both directly from your code running within Azure and from your on-premise applications. Very quickly, you can describe the different types of storage available like this:

Service data, provided by blob support, much like a regular file in your on-premise environment (in future versions we get support for filestreams as well).

Service state, provided by tables, which aren't really tables :) kinda confusing, but it is simply a hierarchical structure consisting of entities with named and typed property values.

Service communication, provided by queues, which are exactly what they sound like: a regular old-fashioned queue where the web role typically posts something that gets picked up by the backend worker role.
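The web role / worker role split around a queue can be mimicked with a plain in-process queue: the public-facing role enqueues work and returns quickly, a background role drains the queue later. A single-threaded sketch of the pattern (the real system is distributed and durable, of course):

```python
# Toy web role / worker role pattern around a queue.

from queue import Queue

work_queue = Queue()

def web_role(request):
    """Accept a request quickly and hand the heavy lifting to the queue."""
    work_queue.put(request)
    return "accepted"

def worker_role(results):
    """Poll the queue and process whatever the web role posted."""
    while not work_queue.empty():
        results.append(work_queue.get().upper())  # stand-in for real work

web_role("resize image")
web_role("send mail")
results = []
worker_role(results)
print(results)  # ['RESIZE IMAGE', 'SEND MAIL']
```

The design point is decoupling: the web role never waits for the work to finish, and either side can be scaled out independently by adding instances that share the same queue.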

You access everything through a fancy web portal, which looks nice in the CTP but leaves you with the impression that it will be a little painful to deal with a major installation using the portal, since it doesn't lend itself too well to large amounts of data.

Finally, one of the coolest things if you're a dev like me is the fact that you get a development fabric, which is a complete simulation of Windows Azure and lets you test out your code in a distributed fashion before deploying to the cloud. Simply put, you get "the cloud on your desktop", fully integrated with your favourite IDE, Visual Studio.

I'm really looking forward to getting hold of an account and actually trying out the bits for real. Unfortunately I was one of those poor sods that had to attend PDC via streaming on Channel 9 :) which means I have to wait a little longer (if by any chance anyone at Microsoft reads this, feel free to help speed up that process).

Monday, November 10, 2008

Windows Azure an introduction and what will it mean to the corporate developer? (Part 2)

The previous post about the Azure Platform was more along the positive vibe, so this time I thought we would have a go at the negative stuff, or at least at some of the concerns buzzing around in my head. I am going about this from the perspective of how this could benefit the LOB applications that we produce where I work, and the problems that we might encounter while trying to incorporate the Azure Platform into our overall architecture. That said, let's get down to business.

So when it comes to outsourcing in general (which applies in both SaaS and PaaS scenarios) we are confronted by issues concerning trust. We need to have a really trusting relationship with our partners to be able to put parts of our business in their hands. Basically the trust issues boil down to two things as I see it: data and availability.

Data: the concern here revolves around data ownership. If we lock into one vendor's storage solution, it will become very difficult to move that data to another provider at a later point in time (I doubt that we will get much help from the vendors in a migration effort). Also, regulatory issues are a big concern for many companies.

Availability: one has to remember that no system can really guarantee 100% uptime in practice; there are simply too many fault factors in the equation. However, I see one major difference, which is that we are in control when a failure occurs on-premise (or at least we like to think we are). Whether or not we actually are in control, we do control the triage process of how we should go about resolving the problem. I'm pretty sure that this process will look completely different when left in the hands of a service provider (I'm guessing the amount of dollars spent will affect the priority you get, and there is nothing wrong with that, just simple economics).

Below you can find a few links concerning availability issues from the current players, such as Amazon, Google and Salesforce, and I am worried that we will see the like for Microsoft as well once the load starts to increase for them:

Amazon EC2 & S3

Amazon Web Services Gets Another Hiccup
Amazon's S3 utility goes down
Google App Engine

Google's App Engine Breaks Down
Google explains why App Engine failed
Salesforce's hiccups down…again
Another issue that I think is even more important is the fact that in a multi-tenant environment we have issues with things such as resource exhaustion (it only takes one bad apple to spoil the bunch). In the Azure Platform they intend to tackle this with a configuration model that specifies things such as intended CPU load and average response times; using this information they will automate the process of scaling out the application when needed. However, I still don't see this handling the poor suckers that end up on a machine with a bad app!
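One way to think about that configuration model is as a set of targets the fabric tries to hold by adding or removing instances. A naive sketch of such a scale-out decision (all thresholds and names invented for illustration):

```python
# Toy autoscaling decision driven by CPU-load and response-time targets.

def desired_instances(current, cpu_load, avg_response_ms,
                      target_cpu=0.70, target_response_ms=200):
    """Scale out when either target is exceeded; scale in only when the
    system is well below both targets and more than one instance remains."""
    if cpu_load > target_cpu or avg_response_ms > target_response_ms:
        return current + 1
    if (cpu_load < target_cpu / 2 and avg_response_ms < target_response_ms / 2
            and current > 1):
        return current - 1
    return current

print(desired_instances(2, cpu_load=0.85, avg_response_ms=150))  # 3 (hot CPU)
print(desired_instances(4, cpu_load=0.20, avg_response_ms=50))   # 3 (idle)
print(desired_instances(3, cpu_load=0.60, avg_response_ms=180))  # 3 (steady)
```

Note that this only reacts to your own application's metrics, which is exactly the multi-tenancy worry above: a noisy neighbour on the same machine degrades your response times, and the only remedy the model offers is paying for more instances.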

Finally, moving into the cloud will have some effect on the way we write applications; many of the aspects we know and love from writing scalable solutions on-premise just get even more critical. Things such as stateless execution and node affinity (or rather the lack of it) will be absolutely necessary to be able to handle provisioning when a catastrophic failure occurs. Upgrading your application will become much more difficult, since you'll have to build your applications so that they have no downtime; therefore we have to be both forward and backward compatible in interfaces, implementation and storage schemas (and believe me, if you don't have any experience in this area, it is hard not to break anything).
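One concrete consequence of no-downtime upgrades is that old and new code run side by side, so every message or record needs a version and every reader must tolerate both shapes. A small sketch of tolerant, versioned reading (the field names are made up for the example):

```python
# Toy versioned-message reader: old writers produce the v1 shape,
# new writers the v2 shape, and one reader must cope with both.

def read_order(message):
    """Handle both the v1 and v2 shapes of an order message."""
    version = message.get("version", 1)   # old writers never set a version
    if version == 1:
        # v1 carried a single "name" field; split it for the new model
        first, _, last = message["name"].partition(" ")
        return {"first": first, "last": last}
    return {"first": message["first"], "last": message["last"]}

print(read_order({"name": "Ada Lovelace"}))                           # v1
print(read_order({"version": 2, "first": "Ada", "last": "Lovelace"})) # v2
```

The same discipline applies to storage schemas: you can only remove support for an old shape once you are certain no instance, queue or table still contains it, which is what makes rolling upgrades genuinely hard.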

All in all, I'm looking forward to tackling these issues in more detail when I get back from my paternity leave, although I expect that I will not be able to drop the issue entirely before that, so I might write something more along the lines of what I think about the design considerations for using the Azure Platform in conjunction with an LOB application running on-premise.

The next part in this series will look a bit more at the details concerning Windows Azure, followed by the final installment taking a closer look at the building blocks closest to my heart: .NET Services and SQL Services.

Sunday, November 9, 2008

TFS Power Tools October 2008 are available for download

The October release of the TFS Power Tools is now available for download here; unfortunately it is a bit delayed, since it's already November :)

You can read more about it in this post by Brian Harry.

My personal favourite this time is the Team Members feature, which lets you interact with your team members straight from within the Visual Studio IDE. You can also do things such as viewing their check-in history, shelvesets and pending work.

Windows Azure an introduction and what will it mean to the corporate developer? (Part 1)

I'm sitting here with a beer, listening to some nice rock music; the family is sound asleep, and I figure I might as well start writing a bit about my thoughts on what was the major theme of PDC08 last week ... namely the Azure Services Platform ... This post is probably gonna end up being a multipart posting, since there is simply so much exciting new stuff to talk about in the platform and how it will change the way we write software on the Microsoft platform.

Over the last couple of years there has been a lot of discussion about delivering "Software as a Service" (SaaS), and we have seen some successful attempts at this; the most noteworthy is surely Salesforce, delivering CRM software as a service in the cloud. More recently we have seen an evolution of this into "Platform as a Service" (PaaS), pioneered by Amazon with their EC2 (Elastic Compute Cloud) and S3 (Simple Storage Service).

Update 2008-11-21: After looking around a lot more at the various cloud offerings out there, and in respect to the comment below, I feel I should clarify my previous statement about Amazon EC2. It is of course an IaaS offering that in itself is part of Amazon Web Services (AWS), which is more correctly called a PaaS offering.

Well, anyway, last Monday (October 27th 2008) Ray Ozzie announced the Azure Services Platform, which is Microsoft's step into the cloud computing arena.

As you can see from the picture, the platform consists of four major parts, where Windows Azure is the core component that everything else builds upon. So what exactly is Windows Azure then?

It's an operating system for the cloud. Normally when talking about the cloud we are talking about the internet, but Windows Azure is really not limited to that; we could just as easily apply the technology behind Azure to any larger datacenter. It's all about efficient management of resources, global scalability and reach. Apart from a hosting environment for our applications, we get a new highly scalable storage system and a set of building block services:

.NET Services (previously known as BizTalk Services) will provide us with things such as federated access control, a hosting environment for our Windows workflows, and last but not least a servicebus for the internet, which will play an important role in what we can do in regards to integration between companies.

SQL Services (previously known as SQL Server Data Services) will provide us with a database in the cloud. Initially the offerings are rather limited and not that different from what's offered via the Azure storage system, but we will get more and more capabilities here.

Live Services, these are no newcomers; you get your basic stuff such as Live ID and Live Contacts. The really interesting piece here, apart from Live ID, is probably the newcomer Live Mesh, which will provide a synchronization platform for syncing data between all your devices.

The platform also contains more traditional SaaS offerings such as SharePoint Services and Dynamics CRM Services. Well, enough with the details (there will be plenty more of those in future parts of this posting); let's get down to what this can mean for people developing software. As I see it there are at least three different users of this platform:


Startup companies

This is probably where a platform such as this really shines. Imagine all the creative people who can realize their ideas by just investing the time to write the code. We can skip the part where we have to build our own datacenter and staff it with expensive and hard-to-find people. Or, once you're in business, you run that really expensive Super Bowl ad and get swamped with customers the next day; instead of building a datacenter for the worst-case scenario, we can just turn a knob and get some more juice ... you just gotta love it!

Small to Midsize companies

Similar to the startup company, we do not have to build up a huge IT department to get our IT infrastructure in place; we will just pay as we go. The main difference here is that we are most likely not going to need the huge scale that the next Facebook would; in this segment it's all about TCO and operational cost as opposed to capital expenditure.


Enterprise companies

All the stuff we saw for the midsize company applies here as well, but I think there are some interesting scenarios for producing hybrid solutions rather than just putting parts of the application in the cloud. Maybe there are possibilities for pushing load into the cloud based on an expected flash load while still maintaining control of the application on premise (there will be several challenges with this and I will post more as I try it in real life).

Another really interesting point is that we will have more opportunity to actually try out ideas, since they will not incur the heavy capital costs of setting up an operational environment; thus we will be able to produce a real working application as a proof of concept, which we can then bring in-house if needed.

So when should we expect all these goodies? Microsoft is talking about a commercial release some time during 2009, which will include multiple datacenters with global distribution.

For more detailed information regarding "Azure" check out the following PDC08 sessions:

Or, if you are short on time, read the following articles to get a quick overview of what it's all about:

Another really cool thing about the whole Azure platform is that Microsoft is adamant about this being a cross-platform environment. There is plenty of talk about the ability to host languages other than .NET in future versions, but we already have accessibility from Ruby (.NET Services for Ruby) and Java (.NET Services for Java).

Finally, be sure to check out Google's App Engine if you are serious about learning more about cloud computing. See you in the next part of this post, where I'll talk about my concerns about how this will play out for a corporate application architect.

Thursday, November 6, 2008

Papa's Got a Brand New Bag ... A quick look at Windows Workflow 4.0

WF 4.0 gives us a completely rewritten workflow engine! Personally I find it a little scary when Microsoft shifts a product around in this fashion; fortunately the changes they are making are really promising and might be just what is needed to get the adoption of WF to really take off.

So what are we getting with this rewrite then?

* A fully declarative model; it is now possible to write workflows composed entirely in XAML.
* A new activity execution model that enables activities to have variables for storing data and arguments for passing data in and out of the activity. Basically it looks very much like a regular function signature, with the possibility of having locally scoped variables within the function body (although these variables are visible when walking down the parent/child chain, making them a sort of scoped global variable).
* Flowchart-based workflows, which let us get around some of the limitations of sequential workflows, such as going back in the workflow after something has occurred. Of course this was possible with a state machine workflow, but not nearly as cleanly as a flowchart does it.
* Re-hosting of the workflow designer has gotten a real overhaul, going from a major undertaking guided by a 20+ page document to a four-lines-of-code experience.
* A totally rewritten WPF designer.
* Major performance improvements.
This is by no means all the stuff available in WF 4.0, but I'll have to get back with further postings after actually having spent some time with the bits. One thing that really bugs me, though, is backward compatibility with previous versions of WF; I'm worried that we will have to port our code by hand if we have not limited our workflows to strictly XAML-based workflows (which isn't that easy to do in the current version).
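Just to make the "fully declarative" bullet concrete, here is a rough sketch of what a XAML-only workflow with a scoped variable might look like. This is extrapolated from the pre-release bits shown at PDC, so the namespaces, element names and the class name are illustrative and may well change before release:

```xml
<!-- Hypothetical sketch: a workflow defined entirely in XAML, with a
     variable scoped to the Sequence and visible to its child activities -->
<Activity x:Class="Demo.HelloWorkflow"
          xmlns="http://schemas.microsoft.com/netfx/2009/xaml/activities"
          xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
  <Sequence>
    <Sequence.Variables>
      <Variable x:TypeArguments="x:String" Name="greeting"
                Default="Hello from a declarative workflow" />
    </Sequence.Variables>
    <WriteLine Text="[greeting]" />
  </Sequence>
</Activity>
```

Note how the variable takes the place of the class fields we use for state in the current code-based model; anything nested inside the Sequence can read it, which is the parent/child visibility mentioned above.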

If you want to get some more details on what's coming, check out these session recordings from PDC08:

WF 4.0: A First Look
WF 4.0: Extending with Custom Activities
WCF 4.0: Building WCF Services with WF in Microsoft .NET 4.0

Wednesday, November 5, 2008

DropBox an alternative to Live Mesh

A while back when Microsoft announced their Live Mesh service I was really excited, until I signed up for a beta account and was informed of the sad fact that once again we poor Swedes have to wait, since it's a US-only thing for starters...

Anyway, I stumbled upon an alternative called DropBox, which offers a similar service for syncing and sharing files. It works in a heterogeneous environment, with clients for Windows, Mac and Linux. You get 2 GB for free and can then upgrade to 50 GB for $99/year.

However, it is not a full replacement for Live Mesh: at the moment they have no public API (although they are hinting that there will be one soon), and DropBox currently lacks support for mobile devices. Live Mesh also has the whole Live Desktop experience and deep integration with the other services in the Live family.

Creating amazing presentations using a zoomable canvas (pptPlex)

I've been focusing more and more on how to create efficient and good-looking presentations over the last year, and recently I stumbled upon a really amazing add-in called pptPlex which lets you create a zoomable canvas for your presentation.

The add-in comes from the "Office Labs" team at Microsoft and lets you create a presentation with a background canvas that lays out the bulk of your slides, giving the audience an overview of what the talk is about; you can then start zooming in on the various sections.

These presentations can also be very interactive, since it becomes very easy to quickly jump between sections of your presentation without having to break out of presentation mode. Another really neat thing is that we can zoom in on the content of our slides, so if we are presenting detail-heavy charts, a quick mouse click will blow up the numbers on the screen.

You could also use this technique to load up your presentation with all the esoteric stuff that you might or might not need and tuck it away in a corner of the canvas; then, if a question arises, you quickly zoom in and bring up a slide about it.

If you want to see more of these kinds of presentations, be sure to google TouchWall, a multitouch-based presentation screen that lets you do really cool presentations (although they are not yet available to the average mortal); you can view a demo given by Bill Gates earlier this year here.

Monday, November 3, 2008

Finally an application server for .NET (Codename "Dublin")

I've been waiting for this since the day Microsoft announced .NET without providing a .NET-specific host environment. We have been left to host our components in COM+ for several years now; this has worked OK, but in my personal opinion it has led to too tight a tie to a technology that has been declared legacy. This fact has, at least for us, led to a slower adoption pace of .NET than we would have liked.

Anyway, enough with the history; let's look forward. But before we do, let's resolve any concerns about BizTalk. Microsoft is stating very clearly that "Dublin" is NOT a BizTalk killer... BizTalk is still Microsoft's solution for integration and, as it looks now, will continue to be released on a two-year schedule. Also, there are no plans to add functionality to "Dublin" for the kind of rich transformations BizTalk is capable of.

Dan Eshner held a very good session at PDC08, called "Dublin": Hosting and Managing Workflows and Services in Windows Application Server, about how "Dublin" works, and I've taken the liberty of using some of the pictures from his slides. "Dublin", or Windows Application Server Extensions, which is the current official name, is a set of extensions built on top of the Windows Process Activation Service (WPAS/WAS) that lets us host both workflows (WF) and services (WCF).

The "Dublin" mantra is ... IT JUST WORKS ... and I must say that the stuff we saw at PDC08 looks promising. One really neat feature that aligns very well with this ambition is the import/export feature, which lets us deploy our binaries along with the correct configuration with a simple click. Under the covers this feature uses a new tool, already in Beta 2, called MSDeploy; you can read more about it on the MSDeploy team blog.

As you can see in the picture above "Dublin" consists of:

A runtime database storing all the configuration and data concerning durable services, as well as tracking and monitoring data.
A management API built as PowerShell commandlets, which is very neat for operational tasks since we can easily script very complex scenarios. We also get a nice set of management tools, built into the IIS management console, that utilize these commandlets.
A set of services in the middle, consisting of:
Hosting: apart from dealing with the actual hosting of the workflows and services, we get support for discovery protocols and a service that looks for orphaned service calls and restarts them if a catastrophic failure occurs (I'm guessing this is a config option, since it requires some design consideration when implementing the service in question; for instance, the service needs to be restartable and can't leave partially finished work behind).
Persistence: we get a new and improved persistence provider for our workflows which is now cluster-aware, so we can have multiple boxes handling the same queues without stomping all over each other.
Monitoring: we get support for monitoring and tracing both workflows (WF) and services (WCF).
Messaging: this is super cool; we get a built-in forwarding service which lets us do things such as routing based on data in the message (much like property promotion in BizTalk), and we also get support for message correlation based on data within the message payload.
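Neither routing nor correlation was shown in code, but the concepts are simple enough to sketch. Here is a language-neutral illustration (Python purely for brevity; "Dublin" itself would be configured declaratively, and the endpoint addresses, field names and threshold below are all invented for illustration):

```python
# Hypothetical sketch of content-based routing and correlation, in the
# spirit of the "Dublin" forwarding service. All names are made up.

def route(message):
    """Pick a destination endpoint based on data in the message payload."""
    if message.get("amount", 0) > 10000:
        return "net.pipe://approval-service"  # big orders need manual approval
    return "net.pipe://order-service"         # everything else goes straight through

def correlation_key(message):
    """Correlate related messages using a value inside the payload."""
    return message["order_id"]

# Two messages carrying the same order_id correlate to the same
# workflow instance, even though they are routed independently.
m1 = {"order_id": "42", "amount": 25000}
m2 = {"order_id": "42", "amount": 0}
assert route(m1) == "net.pipe://approval-service"
assert route(m2) == "net.pipe://order-service"
assert correlation_key(m1) == correlation_key(m2)
```

The point of doing this in the messaging layer rather than in the service is that the router only inspects the payload; the services on either end stay unaware of each other, which is the same reason promoted properties work so well in BizTalk.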
Another cool management feature is the support for persisted instances, which is very similar to the way BizTalk manages this; for example, we can view failed persisted instances grouped by exception, and much more (see the webcast by Steven W. Thomas mentioned below for more details on how this works).

Nothing in the PDC08 presentations mentioned built-in declarative transaction support such as we are used to in the COM+ environment; however, Dan Eshner confirmed in the Q&A that it is on the roadmap for the product, but not in v1.

There is also integration with the "Oslo" modeling initiative, which will enable us to model our service configurations using M and then deploy them directly to "Dublin".

So when will we see this as RTM? At PDC08 they talked about a release roughly three months after Visual Studio 2010, which in turn has been indicated to appear around the end of 2009 (which would correspond nicely with how Microsoft has released VS previously); however, these dates are purely speculative. It will be released as a download for Windows Vista, Windows Server 2008 and Windows 7, and it will be part of the operating system in future versions.

For more information take a peek at:

Steven W. Thomas has produced a webcast showing what the "Dublin" management extensions in IIS Manager will look like.
First Look at Windows Application Server (Dublin)
As always, David Chappell has written a good overview of "Dublin" and a bunch of related stuff (it's a very quick introduction to the technologies; it doesn't go very deep).
Workflows, Services, and Models -- A First Look at WF 4.0, “Dublin”, and “Oslo”
Finally, I found a rather good FAQ-like document at Microsoft which gives some more insight into what's going on with "Dublin" in conjunction with .NET 4.0 (also a small document).
Riding the Next Platform Wave: Building and Managing Composite Applications

I hope this has given some insight into "Dublin"; look for further postings in the future, since this is an area I intend to dig deeper into.