Tuesday, 25 November 2008

DDD7: Welcome to the cloud (Windows Azure)

Big thanks to everyone who turned up to my session at DDD7.  I really enjoyed doing the session, and hope you all found it useful.

The slides and demos are now available to download from my SkyDrive.

The session was also recorded; I will post the link here when it is made available.

For those wondering: in the end I did do a demo of a Silverlight application running in Storage Services (i.e. no web role / website).  I think I will record that as a screencast soon.

Friday, 21 November 2008

Silverlight 3 – 3D and GPU Support

I guess I am a little slow off the mark on this one, but the new GPU hardware-accelerated graphics and the 3D support just sound great.  I can’t wait to see the Spectrum Emulator run under that :)

I really want to play with it now; the downside is I suspect we won’t see any builds until MIX09.

See ScottGu’s Post about it.

Thursday, 20 November 2008

Azure Tables, Entities and Partition Diagrams

It's always nice to explain things with diagrams, no matter how bad the diagrams are :)

A picture of a Table

The following diagram summarizes how a table of NxtGenUG (a UK user group) Coordinators might look in a Table in Windows Azure Storage Services.

This diagram shows that Chris Hay and Allister Frost are coordinators for the Cambridge region of NxtGenUG, and that Dave McMahon and Geff Lombardi are coordinators for the Birmingham region of NxtGenUG.

I have scribbled with my crayons how this relates to tables / partitions and entities in Windows Azure Tables.

image

The key thing to note about this table is that the regions are the partitions (Birmingham, Cambridge). Row keys are unique within a partition but not across partitions (Chris and Dave have the same row key). The combination of Partition Key and Row Key is unique.
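A rough C# sketch of how one of these coordinator entities might be modelled (the class and property names below are my own illustration rather than anything from the SDK; the PartitionKey / RowKey pairing is the important part):

public class Coordinator
{
    // The region acts as the partition key, so all the Cambridge coordinators live together.
    public string PartitionKey { get; set; }   // e.g. "cambridge"

    // The row key only has to be unique within its partition.
    public string RowKey { get; set; }         // e.g. "chrishay"

    // Any other properties just hang off the entity as normal.
    public string FullName { get; set; }
}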

Scaled Partitions

The following diagram shows how this table might be scaled in Windows Azure using Table partitioning.

The key thing to note is that entities within a partition always remain together (i.e. reside physically on the same server).

However, partitions may not necessarily reside together, which is why querying across partitions is slower than querying exclusively within a partition.

image

Silverlight and Tables

So this is how the tables look in the back end of Azure. This is a great structure, and I can see it being something that Silverlight applications could heavily utilize.

A lot of Silverlight applications don't necessarily need the full-blown power of SQL Server; they really just want to save simple data in the back end and be able to retrieve it.

I can see this table-type structure really being utilized within Silverlight. The downside is that Silverlight can't access it directly (you need to go via an ASMX / WCF service). As I have said before, I'm not sure you want to give direct access, however perhaps some sort of delegated authentication scheme could work?

I think this will require more thought???

Azure Presentation

So I’ve just done my first version of the “Introduction to Windows Azure” presentation (the same one I am doing at DDD, called “Welcome to the Cloud”). This was an internal presentation to a local company in Cambridge.

The feedback was superb (one chap said it was a better presentation than the one he had seen at TechEd, which was very nice).

From my point of view, there are a couple of things that I want to change for Saturday (DDD7), however nothing particularly major.

One thing that disappoints me about Saturday is that there is no Silverlight whatsoever in the day. I am now considering changing one of my demos at the last minute to include some Silverlight :) I'm not sure I can cope with a full day without Silverlight.

Anyways, I look forward to seeing folks on Saturday (Silverlight or no Silverlight)

Wednesday, 19 November 2008

First presentation of "Welcome to the Cloud" tomorrow

If you are going to DDD7 and you plan to attend my talk entitled "Welcome to the Cloud", which is a talk on Windows Azure, you will be pleased to hear that you won't be the guinea pigs for this talk.

Very luckily I get to do a run through tomorrow for a local Cambridge company called Sagentia.

This should allow me to iron out any glitches, hopefully in time for Saturday :)

This has been a pretty tough presentation to put together: learning a new technology and presenting on it in 2.5 weeks (evenings and weekends only to prepare).  I wish I was one of those lucky folks who got to see it before PDC, but I didn't, so I really have had 2.5 weeks to prepare this (i.e. since I got back from LA).

It's nice to finally be presenting it for the first time :)

Silverlight and Windows Azure

In my last article I was discussing Silverlight and Amazon's new CloudFront service.

I now want to discuss what the options are with Azure.

Hosting your Silverlight application in Azure

You can obviously host your Silverlight application by hosting your website (which includes your Silverlight app) in Windows Azure, but I thought I'd explore avoiding the website, similar to what I was suggesting with Amazon's CloudFront.

The good news is that you can actually host your HTML file and your Silverlight .xap file (set the MIME type correctly), and you will even be able to interact with web / WCF services via clientaccesspolicy / crossdomain files. You can host your Silverlight application completely in Windows Azure today (subject to the terms of service).
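For reference, here is a rough sketch of the MIME types I would expect to set on the blobs (the mapping is the standard one for Silverlight 2 content; the dictionary itself is just my own illustration):

using System.Collections.Generic;

static class SilverlightContentTypes
{
    // Content types to set on each blob so the browser treats the files correctly.
    public static readonly Dictionary<string, string> Map = new Dictionary<string, string>
    {
        { ".html", "text/html" },
        { ".xap",  "application/x-silverlight-app" },
        { ".xaml", "application/xaml+xml" },
        { ".js",   "application/x-javascript" }
    };
}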

There are some caveats however at the moment:

  • Your Silverlight application won't be able to access private blob storage (it can't sign the HTTP request)
  • Your Silverlight application won't be able to access tables or queues
  • I haven't figured out how to control caching yet (I'm not saying it's not possible, I just haven't spent enough time to figure out if it is)

To be honest, I'm not sure I want my Silverlight applications to access these services directly, certainly not at the moment (I wouldn't want my shared key in the wild). Maybe with a good authentication method this would be attractive.

The big difference between CloudFront and Azure for this type of thing is:

  • Content Expiry / Caching
  • Terms of Service / Pricing
  • Low Latency due to multiple datacenters and good routing

As I said, I think when Azure is out this will be a non-issue.

Silverlight: Amazon CloudFront

I’ve been looking at Amazon CloudFront today, and it seems to me that Amazon is aggressively going for the CDN market.

This really is a CDN (Content Delivery Network) solution (and it’s cheap). All it is is a method of serving up assets (images, documents, HTML, even Silverlight) with low latency by using a delivery network (so if your user is in Hong Kong, the asset will be served from the Hong Kong server).

The good news is that it is ridiculously cheap. 17 cents per GB data transfer, and 1 cent per 10,000 requests.

Silverlight

It also seems to me (although I haven’t tried it yet), that if you wanted to host your Silverlight application within Amazon CloudFront you could do so. You could have an HTML page (hosted in CloudFront), assets (hosted in cloudfront), silverlight XAP file (hosted in CloudFront), and then you could even allow calls to a back end web service (hosted in azure, or your normal boring old web server) via cross domain policies.

UPDATE: This is confirmed as working; be careful with setting your Content Types. See Tim Heuer's comments in the comment section of this post. Thanks Tim.

Azure

I think once Microsoft have set up their many data centers, I would expect a similar type of offering from Azure too. To be honest I don’t think Azure is too far away from this. All Microsoft need to do is provide multiple data centers, allow us to control the cache for Blob Storage Services, and use the closest data center to serve up public URLs. To offer this as a separate offering from Azure would be wrong (I think).

I think that the competition between Amazon and Microsoft in the cloud space is going to be really great, and will drive down the cost of hosting / scaling.

Monday, 17 November 2008

DDD7 – My Session Picklist

So I am looking over the agenda for DDD7 and I am trying to work out which sessions I will attend.

Slot 1

I am in two minds about this one; I am torn between “Top 10 WCF tips” and “Separating REST Facts from Fallacies”.

I really want to see the session on REST but we are bringing that session to Cambridge, unfortunately I don’t often get to see the full presentation at Cambridge because we have the business of running the user group.

I also love WCF and don’t think there are enough sessions on WCF, which is why I would quite like to go to this one too.

It’s a tough choice.

Slot 2

I think I am going to go to “ASP.NET MVC – Show me the code”.  ASP.NET MVC is one of those technologies I have wanted to play with for a while and that I haven’t seen any full sessions on.  I know this sounds odd as it’s been around for a bit now, but there is so much new stuff, some things just get pushed to the side.  I also haven’t heard Steven Sanderson speak before, so that will be cool also.

I would like to see the Network Admin one (Dave McMahon is a brilliant speaker), however I just can’t commit to a double session, and I also want to look at Developery things rather than IT Pro things.

Again, the Virtualisation session is tempting, however the MVC one happens to swing it.

Slot 3

This one is a no brainer to me “ASP.NET 4.0”, I didn’t go to any ASP.NET 4.0 sessions at PDC, and Dave and Phil are great speakers, so this will be a superb session.  Actually this is one of the reasons I avoided ASP.NET 4.0 at PDC, because I knew that this session was coming up :)

Slot 4

This is a tough one. I have seen as much of Oslo as I want to see for now, so I am going to skip over that.

I am not really that interested in testing (also I am sure that we will have Ben across to Cambridge to do that session also), so I think I will skip that also.

I am interested in the Cores one, but Daniel Moth covers that area so well that I think I will go to the “Trust me, I know what you want” session instead.  There are a few reasons: this isn’t a typical developer session, it’s about requirements.  I also haven’t heard Beverly speak, and it’s always great to hear speakers I haven’t heard before.  I also think that there aren’t enough Girly Geek Speakers, so I think I will come along to this one.

Slot 5

Errgh, this is a no-brainer for me, since I am presenting “Welcome to the Cloud”.  I think there may be a few annoyed folks if I went to watch one of the other sessions.

Finally

All in all, it looks like a good day ahead.  It’s quite an interesting looking DDD.  There is quite a varied choice of sessions that are not just development sessions.

There are sessions that are about IT Pro (2 slots), Requirements (1 slot), and Virtualization (1 slot). 

  • 3 out of 20 slots are IT Pro (Developer Admin Guide, Virtualization)
  • 1 Slot is about requirements
  • 1 Slot is on Linux (c# on ubuntu)
  • 4 talks are about ASP.NET / Web (ASP.NET 4.0, MVC, CSS, Scaling)
  • 2 sessions on concurrency
  • 2 sessions on Services / Protocols (WCF, REST)
  • 1 IoC container session
  • 1 WPF Session
  • 1 LINQ Session
  • 1 Azure Session (Welcome to the Cloud)
  • 1 Oslo Session

I think the following did surprise me though:

  • No Silverlight Sessions
  • No C# 4.0 sessions
  • No VB Sessions
  • No F# Sessions

One thing I think is clear, the UK community is pretty diverse (both attendees and speakers).  I say this because these sessions have been proposed and have been voted for. 

One thing is for sure: by attending sessions that are not necessarily straight technology sessions (like mine), you will gain an insight into different methodologies and technologies, which will make you a better developer.

The other interesting thing to note about this DDD is that it seems to be about the now, not the future.  Only 4-5 sessions are about technologies that haven’t shipped (Azure (cloud), 0.5 concurrency, 1 parallel, Oslo, ASP.NET 4.0); three quarters of the sessions are about today.  Now I am not sure if this is because we had to propose sessions prior to the PDC; would this DDD agenda look different if it were in three months’ time?  Alternatively it could be because folks want to be focused on today, not tomorrow.

Anyways, DDD, I love it, and I am really looking forward to it, and I hope to see you there.

Saturday, 15 November 2008

Silverlight: Are sockets only suitable in intranets?

There was a big push by the community at large for Silverlight Sockets to which Microsoft responded by providing said functionality. However can you really use such a feature in an Internet application?

Sockets allow high performance real time data updates, very useful for messaging / real time feeds etc.

Port Range

The fact is that Silverlight sockets require the policy file to be served on port 943, and sockets only work on ports 4502-4534. This means that your customers must have those ports open on their firewall. In a world of locked-down security, is this really likely? I think not!
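To make the constraint concrete, here is a minimal connection sketch (the host name and port choice are my own assumptions; Silverlight will first fetch the policy file from port 943 before allowing the connection):

using System.Net;
using System.Net.Sockets;

public class FeedConnection
{
    private readonly Socket _socket =
        new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);

    public void Connect()
    {
        var args = new SocketAsyncEventArgs
        {
            // Silverlight only allows sockets on ports 4502-4534.
            RemoteEndPoint = new DnsEndPoint("feeds.example.com", 4502)
        };

        args.Completed += (sender, e) =>
        {
            // e.SocketError tells us whether the connect (and the policy check) succeeded.
        };

        _socket.ConnectAsync(args);
    }
}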

Controlled Customer Base

Therefore I think it's fair to say that you can only really use such functionality either in an intranet environment or with a small, controlled customer base. If your product is a mass-market product, do you really wish to prevent some of your customers from using your application, especially over something they may have no control over (strict IT managers etc.)?

Simple Mode and Advanced Mode

I guess one option would be to offer two levels of functionality: simple mode (for those without those ports open), and advanced mode (for those with the ports open), where advanced users gain the benefit of the extra performance on the real-time feed.

Other Alternatives

Use of ASMX / WCF Services provides an alternative to sockets, as does the Duplex Polling feature of WCF. Although these technologies are very good and will meet the requirements of most users, they probably do not provide the performance needed for some applications. They could be offered as options in the simple version of the application.

The Future

So what about the future? In this space I guess some folks would like to see a wider range of available ports (e.g. ones that may already be open on the firewall); another hope would be more performant WCF-based solutions.

For just now, I would say think about your market, your users, and the experience you would like to give them, and make the best choice from there.

Thursday, 13 November 2008

Azure: Fabric + Fabric Controller

So I've been preparing for my session at DDD7 next week and since I couldn't find any suitable readymade diagrams of the Fabric and the Fabric Controller, I had to create my own.

I thought I would share it with the community at large (feel free to steal it, I happily give up any rights that I would be too ashamed to assert).

image

  • Purple Blob - Fabric Controller
  • Blue Blob - Server
  • Orange Blob - Virtual Machine
  • Yellow Blob - Role
  • Green Blob - Agent

SQLBits Session Video Online

So a few months ago I did a session at SQLBits called Useful Sql’y Stuff wot I learned.   I am pleased to say that the SQLBits team have made a video of the session available.

I really did have a great day, and I really enjoyed doing that session.  For me it was one of the sessions I have enjoyed doing the most (especially since I wouldn’t consider myself a SQL Server dev).

It was a lot of fun, and I got some really great feedback from it.

I warn you: if you are a hardened SQL dev, you won’t get a lot out of this session, but if you are a .NET developer that does SQL Server, then there is a lot of information that should help you out.

Wednesday, 12 November 2008

Apologies to Silverlight.Net Community Feed

It looks like an update to Feedrinse (at the Feedrinse side) has cleared my filters (nice one, guys).

This has meant my non-Silverlight posts have been making their way through to the Silverlight community news section.

I have now gone into my Feedrinse account and set up new filters to stop this.

I apologize for this happening (but this truly was not my fault), and I have now rectified the situation.

Tuesday, 11 November 2008

More Spectrum Emulator (blah, blah)

I really have to stop posting about the Silverlight Spectrum Emulator. However, I have just watched This Week on Channel 9 (for last week; confusing, I know) and noticed that the Spectrum Emulator has made it onto the show.

It is also listed in the Press Room for Day 1 of the PDC

http://www.microsoft.com/presspass/events/pdc/videos.mspx

I am totally blown away how well this has been received.

Anyways, I will stop talking about it.

I have a presentation on Windows Azure to build :)

Monday, 10 November 2008

Azure: Hotmail should move to Azure

Maybe it’s time Hotmail moved to Azure.

This is the error message I got when I tried to log in from Messenger.

image

In Azure they would be able to just spin up some more instances to cope with the load :)

Silverlight: (and non Silverlight) PDC Podcast and Interviews

So I was at the PDC this year, and we (myself and John McLoughlin) recorded a podcast and a bunch of interviews for NxtGenUG

We have some of the biggest names being interviewed:

  • Brian Keller
  • Dan Fernandez
  • Martin Gudge
  • Chris Anderson
  • Don Box

Go check out the interviews:
http://www.nxtgenug.net/Podcasts.aspx?PodcastID=59

There are still some more to be made available in the podcasts; outstanding (hopefully available soon) are:

  • Daniel Moth
  • Shawn Burke (Silverlight Controls Fame)
  • Mike Swanson
  • Tim Sneath
  • Pablo Castro
  • The entire Workflow Team (Kenny Wolf included)

Sunday, 9 November 2008

Azure: Sometimes I think too literally

In my last post, I complained that I couldn't set the cache in blob storage.

It has just occurred to me that the metadata prefix is being added in the storage client sample layer.

I wonder whether, if the Cache-Control header were set manually on the request, it would be applied to the blob.

I will investigate and let you know on the progress.

Azure: Unable to control cache for Blob Storage

So I've been messing around with the blob service, which is part of Storage Services of Windows Azure.

No Cache

One of the cool things about this service is that it allows you to publish a file (image, document, video) to the service and expose a public URI which you can view in a browser.

The following is an example of a URL for an image that I have published to my development fabric blob storage in the images container.

http://127.0.0.1:10000/devstoreaccount1/images/4d963b77-f855-47f3-ab80-593cb4d632ac

Although you can do really cool things such as set the content type, unfortunately you cannot control the cache.

There is a Metadata property bag exposed (a NameValueCollection); unfortunately any metadata keys that you specify are prefixed with x-ms-meta-, which means that you cannot set Cache-Control manually.
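A little sketch of the problem (the demo class is my own): whatever key you add to the metadata collection goes onto the wire with the x-ms-meta- prefix, so you cannot smuggle a real Cache-Control header through it.

using System;
using System.Collections.Specialized;

class MetadataPrefixDemo
{
    static void Main()
    {
        var metadata = new NameValueCollection();
        metadata.Add("cache-control", "public, max-age=3600");

        foreach (string key in metadata)
        {
            // Sent as "x-ms-meta-cache-control: public, max-age=3600", which the browser
            // ignores; the blob is still served without a real Cache-Control header.
            Console.WriteLine("x-ms-meta-" + key + ": " + metadata[key]);
        }
    }
}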

Please allow us to do this as a feature, it would be really useful.

How do you pronounce Windows Azure

This is a very funny video which reflects upon how to pronounce Azure.

This is less controversial in the UK, as I think most people are agreed on how it should be pronounced in the UK, regardless of how it is pronounced elsewhere, a very British attitude :)

Unfortunately, as those folks who have ever listened to me present (or had a conversation with me) will know, I have a very messed-up accent.  I basically pick up on whatever I hear most frequently at the time.  If you are in doubt of my nationality, I am in fact Scottish.

So my current pronunciation of Azure is different to the pronunciation I would have had prior to the PDC.  Whether this sticks, only time will tell; maybe I will change back to the UK pronunciation.

Azure: Reading + Configuring, errgh Configuration

So there are a couple of things you need to do in order to correctly set up configuration in Azure.

Any configuration that you plan to use must be defined in the Service Definition and configured in the Service Configuration.

Service Definition

The following service definition defines that I will have a configuration setting named "myWebRoleSetting". Notice that I don't define the value at this point, just the fact that I will be configuring this value for the service.  This is defined in my Service Definition file (ServiceDefinition.csdef).

<?xml version="1.0" encoding="utf-8"?>
<
ServiceDefinition name="HelloWorldWeb" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
<
WebRole name="WebRole">
<
InputEndpoints>
<!--
Must use port 80 for http and port 443 for https when running in the cloud -->
<
InputEndpoint name="HttpIn" protocol="http" port="80" />
</
InputEndpoints>
<
ConfigurationSettings>
<
Setting name="myWebRoleSetting"/>
</
ConfigurationSettings>
</
WebRole>
</
ServiceDefinition>


Service Configuration

Once I have defined my configuration setting, I need to specify the value to use at runtime.  This is configured in my Service Configuration file (ServiceConfiguration.cscfg).



<?xml version="1.0"?>
<
ServiceConfiguration serviceName="HelloWorldWeb" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
<
Role name="WebRole">
<
Instances count="1"/>
<
ConfigurationSettings>
<
Setting name="myWebRoleSetting" value="Hello World 2"/>
</
ConfigurationSettings>
</
Role>
</
ServiceConfiguration>


Accessing the configuration setting from my application

So once I have defined and configured my configuration setting, I will obviously want to access it within my application.  This is accessed via the RoleManager.

The following sample shows how you access the previously defined configuration setting.



RoleManager.GetConfigurationSetting("myWebRoleSetting")


As said in previous posts, if you want your application to run both in and out of the Fabric, then you will need to abstract configuration into a common class.
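Something along these lines would do it (a minimal sketch; the class name and the fallback to appSettings are my own choices, not part of the SDK):

using System.Configuration;
using Microsoft.ServiceHosting.ServiceRuntime;

public static class ConfigurationReader
{
    public static string GetSetting(string name)
    {
        // Inside the Fabric, read from the service configuration (ServiceConfiguration.cscfg).
        if (RoleManager.IsRoleManagerRunning)
        {
            return RoleManager.GetConfigurationSetting(name);
        }

        // Outside the Fabric, fall back to the ordinary appSettings section.
        return ConfigurationManager.AppSettings[name];
    }
}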


Silverlight: Not been blogging about Silverlight for a while

So if you follow my blog you might have noticed that I have not been blogging about Silverlight for a while.

The major reason for this has been that I have been presenting on various topics recently that have been taking up a major amount of time. This has meant that I have been a little less focused on Silverlight

For example, at the moment I am currently preparing to do a session on Windows Azure at DDD7 in less than 2 weeks time, so my complete focus at the moment is on Windows Azure.

Once this session has passed, myself and my fellow NxtGenners are doing the Silverlight Assault Course again in January (in a few locations). So I would imagine that in December / January I will be back on Silverlight, hopefully combining Windows Azure and Silverlight in lots of different samples.

Azure: Why can't I use appSettings

I have to admit I didn't get the point of this at all, until this afternoon.  It's one of these experience things.

So during the PDC (I obviously didn't pay enough attention), I took note that in the world of Azure we were to use the new ConfigurationSettings section instead of appSettings.  OK I thought, but why??

Why are appSettings no good?

I thought I'd try a sample that used appSettings, to see if it threw up an exception.  I discovered that both in the "Development Fabric" and in "The Fabric" I had no problem using appSettings, so why was I supposed to use this new ConfigurationSettings section, and use the RoleManager to access it?

The answer, it turns out, is pretty simple: it comes down to changing the settings at runtime.

Guess what, you have no access to your web.config / appSettings at run time, and therefore if you want to change the value, then you must deploy a new version of the application.

However, at runtime you do have access to the ServiceConfiguration.cscfg file and therefore can make on-the-fly modifications (as shown below).

image

This is one of those situations, that if you plan to run your application both in and out of the Fabric, then you should have a Configuration Manager of some sort to allow you to switch between the Azure ConfigurationSettings version and your normal appSettings version.

Thursday, 6 November 2008

Azure: Logging in and out of the fabric

This article really ties two articles together.  In this article I talked about discovering whether you are running normally or in the Fabric, and in this second article I talked about logging in Windows Azure

The following line of code is how you write to a log within the Fabric (in Azure):

RoleManager.WriteToLog("Critical", "I am a critical error");


If however you try to run this line of code when you are not in the Fabric (e.g. a normal ASP.NET web application), you will get an Object Reference exception.

Therefore having a common logging class which checks whether you are running in the Fabric or running normally would be a pretty useful thing.

And you can achieve that by combining this post and the other two posts.
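Something like the following sketch would do it (the class name is mine; it simply wraps the two techniques from those posts):

using System.Diagnostics;
using Microsoft.ServiceHosting.ServiceRuntime;

public static class Logger
{
    public static void Write(string level, string message)
    {
        if (RoleManager.IsRoleManagerRunning)
        {
            // In the Fabric: use the Azure log.
            RoleManager.WriteToLog(level, message);
        }
        else
        {
            // Outside the Fabric (e.g. a normal ASP.NET application): fall back to Trace.
            Trace.WriteLine(level + ": " + message);
        }
    }
}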

Azure: Invalid application domain name

So I got my invitation code for Windows Azure last night, so I thought I'd deploy my first service to the cloud.

I got a little stuck for a while: I kept getting the error message "Invalid application domain name".  It turns out that "silverlightuk" is not a valid name for my hosted service account.

In the end I went for chrishayuk.

So if you get this error message, you know you have picked a special name, and you need to pick another.

Azure: Writing to Logs

So I've been messing around with logging of data when running web applications in the Fabric.

So it is fairly easy to write a message to the log

RoleManager.WriteToLog("Verbose", "I am verbose");


This fairly obviously writes a verbose message to the log.

Types of Logs

The WriteToLog method supports five types of logging:

  • Critical
  • Error
  • Warning
  • Information
  • Verbose

I think I would have preferred it, however, if an enum had been used rather than passing the level through as a string.



If you pass an invalid log name, then it raises an exception.
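A small sketch of the enum-based wrapper I have in mind (the enum and helper are my own, not part of the SDK), which would also stop you passing an invalid log name in the first place:

using Microsoft.ServiceHosting.ServiceRuntime;

public enum LogLevel
{
    Critical,
    Error,
    Warning,
    Information,
    Verbose
}

public static class Log
{
    public static void Write(LogLevel level, string message)
    {
        // RoleManager still wants the level as a string, so convert at the edge.
        RoleManager.WriteToLog(level.ToString(), message);
    }
}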



Default Level of Logging

The default level of logging for a service seems to be Information.  This means that all types of logging messages will be logged except Verbose.

If you want to switch your level of logging to include Verbose, then within the Development Fabric you can just set the level of logging via the Tools -> Logging Level menu.

Screenshot of Logs in the Development Fabric

image

Critical Logs

It turns out that messages written to the Critical log will be sent to you via your preferred notification method.

At the moment my notification method is messenger + email.  I must try that out later.

Azure: Am I running in the fabric

So one of the cool things about Windows Azure is that you can take an existing asp.net application, and just run it in the cloud (by providing the ServiceConfiguration and ServiceDefinition config files).

So we know that working in the cloud and working out of the cloud can require different behaviours from our application.

RoleManager.IsRoleManagerRunning (namespace Microsoft.ServiceHosting.ServiceRuntime) tells us whether our application (role) is running in the Fabric.
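A tiny usage sketch (the page and the messages are my own illustration):

using System;
using System.Web.UI;
using Microsoft.ServiceHosting.ServiceRuntime;

public partial class _Default : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The same code base can behave differently in and out of the cloud.
        Response.Write(RoleManager.IsRoleManagerRunning
            ? "Running in the Fabric"
            : "Running as a plain ASP.NET application");
    }
}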

Azure: Running Visual Studio in Administrator Mode

So this is a little annoying, in order to use Visual Studio 2008 and Windows Azure, you need to run Visual Studio 2008 in Administrator Mode.

The reason it needs to do this is that it needs elevated privileges to start the development fabric.

However, if I already have the development fabric running, I would have thought it would be okay not to use administrator mode.  Nope, not the case.

Ho hum, I hope this is something that gets resolved in the future, as it's a little annoying.

Wednesday, 5 November 2008

Azure: Received my Invitation Token

Yay, thank you Microsoft!

My invitation token arrived for Windows Azure.

Tomorrow night I will be trying out deployments :)

Azure: Configuring SSL in the local development fabric

If you want to test your website in the cloud against an https endpoint, then there is quite a lot of messing around to get it working.

This is a very cool article which goes through this process with you.

Tuesday, 4 November 2008

Windows Azure: Development Fabric - Vista SP1 and Windows 2008 only

So before you download the CTP make sure you have Vista SP1 on your development machine (or Windows Server 2008) otherwise you won't be able to install it.

I've spoken to quite a few folks who have been complaining that it doesn't run on XP, and have been asking me why.

Part of the answer is pretty simple (I am sure there will be other reasons also): the Development Fabric is completely dependent on IIS 7. Therefore you need Vista or Windows Server 2008 (sorry).

CNN: 3D Hologram

So I am playing with Windows Azure and I happen to have the US Election Show on CNN.

I have actually watched a bit of TV history.

They projected a 3D Hologram into the studio, which is then transmitted on air.

I believe they used HD cameras all around the lady filming every angle.  This is then projected back to the studio.  The main cameras are synchronized with the remote cameras, so you can see the correct angles of the lady (as if she was really there)

So the on air presenter was having a conversation with the lady in the other studio, as if they were in person, very cool stuff.

It looks very cool, I can see this technology getting much better and used so much more in the future.

I definitely think once this sort of technology gets cheaper it will pave the way forward for video conferencing; expect to see these sorts of tricks on UK television soon.

Azure: Spinning up new instances

So I want to test my cloud application and how it will react if I scale my application out.

By default we run one instance of the service, but if I wish I can test with more instances in my development fabric, so I don't get any nasty surprises.

To do this all I need to do is modify my Service Configuration file.

ServiceConfiguration.cscfg

To spin up more instances, all I need to do is modify the Instances element's count attribute to the number of instances I require.

So in the config below, I have configured the number of instances I am using to 3.

<?xml version="1.0"?>
<
ServiceConfiguration serviceName="HelloWorldWeb" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
<
Role name="WebRole">
<
Instances count="3"/>
<
ConfigurationSettings>
</
ConfigurationSettings>
</
Role>
</
ServiceConfiguration>


The really cool thing is that I can see (and debug) the multiple instances in my development fabric (and look at the logging).

I've pasted a screenshot below:



image

Azure: Development Fabric Service EndPoints

So I've been messing with Windows Azure, been having a lot of fun playing.

One of the interesting things that I have been looking at is the Service Definition File.

Cloud Service Project

So if you create a cloud service in Visual Studio 2008, it will automatically create a project which contains a Service Configuration file (ServiceConfiguration.cscfg) and a Service Definition File (ServiceDefinition.csdef).  These files are used to configure the service in the cloud

ServiceDefinition.csdef

So I've attached a copy of my Service Definition file for my ASP.NET Project which I have deployed to the cloud.

The key thing here is that in my local development fabric I can modify the ports to whatever port I choose.  This is helpful if I am testing multiple services on my local machine.

When deployed to the Fabric in the Cloud, I must use port 80 for http and for https I must use port 443 in the cloud.

I can also have multiple endpoints listening, however I can only have one input endpoint per protocol.

So I can have one endpoint for http and one for https, but not 2 for http.

Within the development fabric, as I said before, I can set the port to whatever I choose; however, if that port isn't available it will find and use the next available port.  So for example if I try to use port 10000 (which is used by the Development Storage), it will find the next available port (10002 in my case).

So be careful, and check that you are using the port you are expecting to use on your development fabric.

<ServiceDefinition name="HelloWorldWeb" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole">
    <InputEndpoints>
      <!-- Must use port 80 for http and port 443 for https when running in the cloud -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
  </WebRole>
</ServiceDefinition>

Azure: Development Storage for those without SQL Express

So if you are like me and you don't have SQL Server Express Edition installed, you might have a couple of issues with the Development Storage in Windows Azure.

If you launch development storage you will get some error about SQL Express edition not being installed.

So how do you get around this?

First of all you need to find your DevelopmentStorage.exe.config file; it's located in "C:\Program Files\Windows Azure SDK\v1.0\bin".

Open it up in Notepad or Visual Studio, and change the database server instance in two places: the Data Source in the connection string, and the dbServer attribute on the Table service in the sample config below.

If you then start up your development storage, all should be okay once more.

<?xml version="1.0" encoding="utf-8" ?>
<
configuration>
  <
configSections>   
    <
sectionname="developmentStorageConfig" type="Microsoft.ServiceHosting.DevelopmentStorage.Utilities.DevelopmentStorageConfigurationHandler, DevelopmentStorage.Utilities"    />
  </
configSections>
 
  <
connectionStrings>
    <
addname="DevelopmentStorageDbConnectionString"
         connectionString="Data Source=.\SQL2008;Initial Catalog=DevelopmentStorageDb;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </
connectionStrings>
 
  <
appSettings>   
    <
addkey="ClientSettingsProvider.ServiceUri" value="" />
  </
appSettings>
 
  <
developmentStorageConfig>
    <
services>
      <
servicename="Blob"
               url="http://127.0.0.1:10000/"/>
      <
servicename="Queue"
               url="http://127.0.0.1:10001/"/>
      <
servicename="Table"
               url="http://127.0.0.1:10002/"
                     dbServer="localhost\SQL2008"/>
    
    </
services>
   
    <!--
NOTE: These are preconfigured accounts with well known keys. The purpose of the
      authentication supported by the development storage is simply to allow you to test
      your authentication code. It has no security purpose.
      It is strongly recommended that you DO NOT use your actual storage account or key over here.
      These keys are stored unencrpted on disk and in SQL databases.
    
-->
    <
accounts>
      <
accountname="devstoreaccount1"
               authKey="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
               isAdmin="false"
               contactInfo=""/>
    </
accounts>
  </
developmentStorageConfig>
   
  <!--
 
Use this section to configure logging of  requests and  responses from the development storage
  (http://msdn.microsoft.com/en-us/library/ty48b824.aspx).
 
  The logs go to %LOCALAPPDATA%\DevelopmentStorage\Logs and are named after the date/time when
  the development storage was started.
 
  The default logging level is Critical. You can change this to Information or Verbose for debugging.
 
-->
 
  <
system.diagnostics>
    <
sources>
      <
sourcename="System.Net.HttpListener" switchName="default" >
        <
listeners>
          <
addname="defaultListener"/>
        </
listeners>
      </
source>
    </
sources>
    <
switches>
      <
addname="default" value="Critical"/>
    </
switches>
    <
sharedListeners>
      <
addname="defaultListener"
           type="Microsoft.ServiceHosting.DevelopmentStorage.Utilities.CustomizedTextWriterTraceListener,DevelopmentStorage.Utilities"
      />
    </
sharedListeners>
    <
traceautoflush="true"/>
  </
system.diagnostics>
</
configuration>

Azure: DDD7 – Welcome to the Cloud

So I am working on a new session for DDD7 for which the title is “Welcome to the Cloud”, however you could read that title just as happily as “Welcome to Windows Azure”, or “Introduction to Windows Azure”.

So if you haven’t guessed already this session will be all about Windows Azure.

I hope to give a good introduction to Windows Azure and get you started developing on the platform / SDK.

Hope to see you there

Monday, 3 November 2008

Silverlight: Spectrum Emulator at PDC Show Off

I promise to stop going on about the Silverlight Spectrum Emulator.

However, I was very proud that it appeared in the PDC Show Off, it didn't win (there were much more worthy entries), however it was great to see it displayed.

You can see the video of the emulator on channel 9

Big thanks to Brian Keller and Dan Fernandez for hosting the event and showing the video

Silverlight: Spectrum Emulator is in the Community Gallery

I've noticed that my Silverlight 2 Spectrum Emulator is in the Community Gallery now :)

Windows Azure: Pricing Idle CPU Time

My past few blog posts have really been focused on developing against the pricing model of Azure.

I don't really want to be thinking about cost efficiency, but I do believe that in a world of Azure developers will be very aware of the cost of an application.

Looking at the current Azure metrics, it seems that the following will be monitored (and eventually billed, although not in the CTP timeframe):

  • CPU time, measured in CPU-hours
  • Bandwidth for ingress/egress from the data center, measured in GBs
  • Storage, measured in GBs
  • Transactions, measured as requests such as Gets and Puts
To be clear, the CPU time measured INCLUDES IDLE TIME. While this makes a lot of sense for the CTP, as essentially each instance of a role is equivalent to one VM (against one core), it doesn't make sense beyond the CTP. This is also the same model that Amazon uses with their EC2 cloud solution. Unfortunately, with that model CPU time on Amazon costs $90 per instance per month (which is pretty expensive).

I hope Microsoft doesn't follow this model, as it can be pretty expensive for the average developer. However, I do get the feeling that the one-VM-per-role model (as used in the CTP) will not be the same model used when it goes into production.

Hopefully Microsoft will look at a model which combines the cost of running an instance with a CPU utilization cost. This would give a low-cost solution for developers with services that have low utilization, i.e. we don't really want to be paying for idle instances.

Sunday, 2 November 2008

Windows Azure: Cache based Session Providers????

So following on from my previous article regarding cost-efficient code reviews, I am going to predict a potential new session provider model.

In a world of Windows Azure we will be making more decisions for our applications based on cost. Session is an obvious target, as the current session model in ASP.NET could cost us a lot of money (depending on Microsoft's pricing model).

Windows Azure, Cache and Velocity

So at the moment the only realistic place to keep this kind of data with Azure is Windows Azure Storage's Table Storage Service, and there is already an ASP.NET provider supplied in the Azure SDK (technically you could use SQL Services, but this would be more expensive and unnecessary).

So Microsoft have said they plan to extend the options available for Windows Azure Storage Services and provide a Cache Service (based on Velocity).

This is likely to be cheaper and faster than using the Table Storage Service. If it's not cheaper (or at least the same price), I suspect folks will continue to use the Table Storage Service. This is a really good thing and will mean that we can reduce our costs further when using caching with ASP.NET.

Cache based Session Provider

So assuming that the Cache Service is cheaper than the Table Storage Service, it is logical that Microsoft will provide a Session State provider that is based on the Cache Service. If Microsoft don't create such a provider, then I suspect someone else will.

The reason I suspect this is that, if it's cheaper to use the Cache Service than the Table Service, it makes sense to utilize that service for what is considered volatile data. It wouldn't be hard to build such a provider for the session.

In a web application where session data is frequently read and written, if costs can be reduced using an alternative model then it makes commercial sense to utilize this.

I guess the only case in which such a model wouldn't be explored is if the pricing of the Cache Service and the Table Service were the same.

Silverlight and Session Providers

This also raises an interesting question for your application and Silverlight, i.e. is exposing the session via web services to your Silverlight application the most cost-efficient model? I will try to think about how this affects Silverlight in future articles.

Cost-efficient development is definitely coming, and these are the sorts of areas that will be explored.

Windows Azure - Cost Efficient Code Reviews

So with the advent of Windows Azure, I see that companies will put more stress on Code Reviews.

This is a great thing for companies in general, as I do believe that not enough companies invest enough time in Code Reviews.

So why do I believe that Windows Azure will drive more investment in Code Reviews? I believe this because, without code reviews, a badly written application could cost a company a lot more money than a well-written one.

Reviewing for Cost Efficiency

I suspect a new set of best practices and patterns will emerge with the ultimate aim of keeping costs low for applications. I suspect we will also see the development of new tools which will allow you to simulate loads and predict the cost of applications.

In order to maintain cost efficiency, companies will require team leads to review code with a particular emphasis on cost.

Code Reviews will be looking for:

  • Unnecessary use of SQL Services (which has a higher cost than Windows Azure Storage)
  • Appropriate use of caching (rather than just retrieving from the storage area blindly)
  • etc, etc

Session State Example

A good example of where you can incur unnecessary costs in a web application is session state.

An ASP.NET application will retrieve the session for the current user on each HTTP request. In a normal world this is not a big deal, because we have already paid for our session state. I say we have already paid for our session state, as we either have an in-process session or a SQL Server session. There is no real cost for maintaining our session; the infrastructure is already there. We have paid for our SQL database, and we tend not to be charged by hosting companies for internal network bandwidth usage.

In a world of Azure we cannot use in-process session providers, because we cannot rely on requests always being serviced by the same server / VM. Therefore we must keep session state in Windows Azure Storage (you could keep it in SQL Services, but that would frankly be frivolous). This is the only way (at present) we can guarantee to keep our session data at scale.

The downside is that every time we retrieve data from the session, it's costing us money (Microsoft has not released its pricing model, so we don't know how much yet). So we therefore have a few considerations.

Does the page in question use the session?

If not, switch it off for that page; this is a performance best practice anyway, but one which is generally ignored. If you switch off the session for that page then you are potentially saving yourself bandwidth, storage and CPU costs (depending on Microsoft's pricing model).

Is it appropriate to keep the data in the session?

Let's say there is a piece of data that is potentially accessed by the user in the course of a session, but it may or may not be used. Is it appropriate to keep that bit of data in the session (especially if it's quite large)? To be honest, that sort of question is still valid for today's applications, but a utilization cost model massively increases its importance.

Conclusion

Anyways, this is just my rambling as I sit in an airport in Los Angeles, however I think these sorts of questions will become very important in a world of Windows Azure.

Developers, be prepared to optimize applications not just for performance but now also for cost.

It will be interesting to see the guidelines, best practices and tools that get developed in this area.