Oslo – It’s a Floor Wax and a Dessert Topping…

At this year’s PDC, Microsoft finally unveiled a first look at their new modeling platform, code-named “Oslo”.

The first question that comes to mind before we can talk about Oslo is “What is modeling?”

Modeling in software development is nothing new; it has been attempted in several forms before. In the past, software modeling was done with CASE tools that attempted to turn UML-style graphical structures into code. In general, these tools were very expensive and for the most part were tied to specific software development methodologies created by the thought leaders of that time (e.g., Booch, Rumbaugh, Jacobson). One of the more popular tools, Rational Rose, allowed users to create UML diagrams which could then be used to generate code.

While these modeling technologies captured the imaginations of developers at the time, they failed to deliver on the promise of making the software development process more manageable and understandable. I remember trying out Rational Rose and bumping up against the following issues:

  • The impedance mismatch between a general modeling language like UML and the implementation languages of that time (C and C++). It was sometimes difficult to express specific C++ constructs in UML, and vice versa.
  • The graphical nature of the modeling, combined with the monitor sizes of that era, made it difficult to get a good view of complex software solutions.
  • Software in general was also much more tightly coupled in those days, which made it harder to model.

While I’m sure there are success stories that can be cited for this style of modeling, it never became part of what I would call mainstream development. These tools were just too difficult and expensive, making it questionable whether they provided any real advantage.

When I was first exposed to these tools, I was writing shrink-wrapped software for a single PC. Since then, software development has gotten much more complex. Throughout the history of software development, new abstractions have been introduced to make things easier so that we could produce more robust solutions. For example:

  • Assembly language was introduced to make machine code easier to work with
  • C made it easier to create more complex solutions that were difficult to build in assembly
  • SQL provided a common language that could operate on top of structured data
  • C++/Smalltalk and other object-oriented languages helped reduce complexity by allowing encapsulation into objects
  • COM/DCOM/CORBA were introduced to make it easier to break software into reusable components
  • .NET/Java and other managed runtimes provided an abstraction on top of the underlying operating system
  • XAML provides a more declarative abstraction that describes what is desired rather than the imperative steps to accomplish the outcome

Often these technologies were received with mixed feelings. Some could see their potential value while others talked about the loss of control and performance that the abstraction would bring. But with many of these, the increase in computing power and the new solutions that were made possible justified the move to greater and greater abstraction.

Speaking from my own personal experience, the software development process has gotten more and more complex, especially with regard to large enterprise solutions, which have almost become unmanageable. With SOA-style architectures we no longer have systems that are deployed and run in a monolithic fashion. Instead, these systems are an interconnecting maze of services, not all of which are under your direct control. Systems have gotten very difficult to maintain and configure. Software also seems to have a much longer lifetime, making it very difficult to understand its intent, especially as new developers come on board and the experienced developers that originally authored the code leave.

Microsoft realizes that this complexity is only going to increase as we look to a future where some software will be deployed both on-premises and in the “Cloud”, as well as to many other devices. These future solutions and experiences will be very difficult to create and manage using the tools we have today. The level of abstraction will need to be raised once again, which is what Microsoft is attempting to do with the introduction of “Oslo” at PDC08.

Oslo is quite an expansive offering from Microsoft that brings modeling to the forefront. Instead of just modeling the software itself, Oslo has the much broader and more ambitious goal of modeling the entire software development lifecycle, from inception and business analysis, to design and implementation, to deployment and maintenance. With Oslo, these models are living and breathing: they not only organize and support the development process, they are actually used to run the software and systems via runtimes that consume these models.

During PDC08, Chris Anderson and Giovanni Della-Libera quipped that Oslo was a “…dessert topping and a floor wax…” (click here for the SNL reference). This overarching view, where Oslo is fully integrated into the software development process, is what makes the project so ambitious.

In short, Oslo is composed of three primary building blocks:

  • Repository: In Oslo, all of the modeling data is stored in a SQL Server repository. This data contains all of the entities being modeled: software entities, workflows, business analysis, hardware, stakeholders, and so on.
  • Modeling Language (“M”): Oslo comes with a new modeling language called “M”. This language has several dialects:
    • M-Schema: This dialect is essentially a Domain Specific Language (DSL) for modeling storage, more specifically database storage. Using this DSL, one can easily create a database schema.
    • M-Graph: This dialect is a DSL for adding data to a database, providing a very simple way to populate it (a small sketch of both M-Schema and M-Graph follows this list).
    • M-Grammar: This is the most interesting of the three because it is essentially a DSL for creating DSLs. Creating languages is very complex and was previously left to the realm of researchers. M-Grammar makes it much easier to create textual Domain Specific Languages that give users a way to code in specific domains.
  • Modeling Tool “Quadrant”: To be able to consume, browse, create and change the vast amount of modeling data associated with an enterprise software solution, Microsoft needed to build a tool that would make this feasible. This tool is code-named “Quadrant”, and it is essentially a graphical front end to the Oslo repository.
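
To make the first two dialects a little more concrete, here is a minimal sketch of what an M-Schema type and extent, plus some M-Graph instance data, might look like. The module, type and field names are my own invention, and the exact syntax varied across the early CTPs, so treat this as illustrative rather than definitive:

    module Contacts
    {
        // M-Schema: describe the shape of the data
        type Person
        {
            Id : Integer32 = AutoNumber();  // assumed auto-numbered key
            Name : Text;
            Age : Integer32?;               // '?' marks the field as optional
        } where identity Id;

        // An extent (roughly, a table once compiled down to SQL),
        // initialized here with M-Graph instance data
        People : Person*
        {
            { Name = "Ada",   Age = 36 },
            { Name = "Grace", Age = 45 }
        }
    }

The intent is that the M tool chain turns a module like this into T-SQL that creates the storage for People and inserts the two rows, which is the kind of thing the “Oslo”: The Language session below walks through.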

Microsoft showed off all three of these components of Oslo at the PDC in various presentations and keynotes. It was very obvious that we were seeing the early results of the last few years of effort. Many of the demos were fairly simplistic, making it clear that there is still a considerable amount of work to do before Oslo is ready for prime time.

At the PDC there were five sessions devoted to Oslo. These were all very informative and very well attended. I was able to attend many of these live and the rest I have watched online. Here are the links along with a quick summary of each.

A Lap around “Oslo”

Presenters: Douglas Purdy, Vijaye Raji

WMV-HQ WMV MP4 PPTX

This presentation was a great introduction to Oslo touching on all of its components. If you have time to watch only one of these sessions, I would recommend this one as it gives you a general feel for the potential of this new modeling platform.

“Oslo”: The Language

Presenters: Don Box, David Langworthy

WMV-HQ WMV PPTX

This presentation discussed the details of the “M” language itself. They showed how the language is used to generate schemas and data, and briefly touched on creating DSLs. They also showed the “Intellipad” tool used to edit and create “M”. I’m sure that this Intellipad functionality will at some point be integrated directly into Visual Studio.

“Oslo”: Building Textual DSLs

Presenters: Chris Anderson, Giovanni Della-Libera

WMV-HQ WMV MP4 PPTX

This was a very interesting presentation that showed how to create new textual domain specific languages (DSLs) using “M”. Language creation can be a very daunting task; I’m sure anyone that studied computer science in college will remember how difficult it was to use LEX and YACC to create a language. “M-Grammar”, which is what this session discusses, makes it very easy to create DSLs. I believe that DSLs will become more and more prevalent in the future and will enable applications to be created more quickly and reliably. Many of us deal with DSLs in our day-to-day work already; common examples include XSL (a language for transforming XML), SQL (a language for querying data) and HTML (a language for creating web content).
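
To give a flavor of what this looks like, here is a rough sketch of an M-Grammar language definition for a toy DSL. The language name, rule names and projections are all made up for illustration, and the keyword details may not match the shipped CTP exactly:

    module Demo
    {
        // A toy DSL: lines like "contact Ada age 36" are projected
        // into Person-shaped values that could feed an M extent.
        language ContactDsl
        {
            syntax Main = c:Contact* => Contacts { valuesof(c) };

            syntax Contact = "contact" n:Name "age" a:Age
                             => Person { Name = n, Age = a };

            token Name = ('A'..'Z' | 'a'..'z')+;
            token Age  = ('0'..'9')+;

            // whitespace is skipped between tokens
            interleave Whitespace = ' ' | '\t' | '\r' | '\n';
        }
    }

In the demos, Intellipad could show the sample input, the grammar and the projected output side by side, which goes a long way toward making this approachable compared to the LEX and YACC days.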

I highly recommend this presentation. To get the most out of it, you may want to watch the two previous ones I mentioned before watching this one.

“Oslo”: Customizing and Extending the Visual Design

Presenters: Don Box, Florian Voss

WMV-HQ WMV MP4 PPTX

This presentation drilled into the “Quadrant” tool used to explore the models stored within an Oslo repository. While it was obvious that the tool is a work in progress and still has quite a ways to go to achieve its goals, you can get an idea of how it will let you view the enterprise in an easy, ad-hoc way. This session goes into the extensibility of “Quadrant” and how it can be tailored to fit very different types of users.

“Oslo”: Repository and Models

Presenters: Chris Sells, Martin Gudgin

WMV-HQ WMV PPTX

This presentation gets into the repository itself. The repository is built on top of SQL Server and provides secure access to the model data stored there. The presentation touches on the many models (written in “M”) that are included with the repository. These existing models are used to model things like Identity, Applications, Transactions, Workflow, Hosting, Security and Messaging (just to name a few). Chris Sells shows how models are compiled and loaded into the repository, as well as how to access model data from the repository. He also covers the core services provided by the repository: Deployment, Catalog, Security and Versioning.
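
Since the built-in repository models are themselves written in “M”, it may help to picture what a model module with queryable data can look like. The following is purely a hypothetical sketch of my own (not one of the actual repository models), showing an extent plus a computed value that queries it:

    module Sample.Models
    {
        // an extent holding application records
        Applications : { Name : Text; Version : Text; Retired : Logical; }*;

        // a computed value: a query over the extent, exposed in a
        // view-like way once the model is loaded into the repository
        ActiveApplications()
        {
            from a in Applications
            where a.Retired == false
            select a
        }
    }

Roughly speaking, once a module like this is compiled and loaded, the repository’s catalog, security and versioning services take over managing it, which is what this session walks through.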

Other interesting posts on Oslo:

In addition to these videos, here are some other interesting articles that I have stumbled across on Oslo:

Introducing “Oslo”

Why “Oslo”?

Martin Fowler on Oslo

PDC08, from my perspective…

Welcome to the PDC08

The PDC was a great conference showing much of Microsoft’s future vision for their products and platforms.

I would say that this conference could be broken up into several major areas of interest:

  • Oslo
  • Azure Cloud Operating System
  • Live Services
  • User Interface – Silverlight
  • Languages – C#, Dynamic Languages, F#
  • Windows 7

Overall, I was very impressed with Microsoft’s ability as a company to coordinate the efforts of many diverse groups and technologies throughout the company. It seems that many of the efforts and initiatives that Microsoft has undertaken are starting to come together into a common cohesive vision. That said, it was also evident to me that they still have a bit of work to do to bring all of this new technology together for prime time usage. Although the PDC is an event that usually occurs only every few years, I did hear a rumor that Microsoft has already announced a PDC09. This makes me believe that the timing of this PDC may have been a little aggressive. For many of the Azure and Live services sessions the presenter was in constant IM contact with the Microsoft datacenter. That tells me that there were lots of stability concerns regarding the products that had strong reliance on their cloud services.

My personal interests drove me to many of the sessions on their cloud-based services and Oslo. It was pretty well known that Microsoft would be announcing a cloud-based platform and a Live Mesh client-side platform, but many of the details were clouded in secrecy before the PDC. Oslo was also talked about a little before the conference and was shown in somewhat more detail at the PDC.

Over the last several years, Microsoft has been building datacenters at a record pace. During the first keynote address, Ray Ozzie talked about how, in building up its own web-based internet properties, Microsoft realized that many of the same activities were being undertaken by companies large and small around the world. While many large companies could afford to build datacenters that provide reliability, redundancy, fault tolerance and so on, smaller companies were struggling with this overwhelming task. Even very large companies were having trouble scaling out their services to handle geo-location and fault tolerance. It was becoming clear to Microsoft that cloud-based services to complement on-premises software were needed. In his keynote, Ozzie recognized similar efforts by both Google and Amazon in this area.

With Windows Azure, Microsoft differentiates itself from the cloud-based services offered by Google and Amazon by offering a service that is:

  • Completely elastic: computing power can be dynamically sized and scaled at runtime so that you can handle peak loads without paying for more than you need.
  • Geo-located: a cloud-based deployment can be spread across the globe.
  • Fully fault tolerant: all data and software in the cloud is stored in several places and always spread between servers at different locations.
  • Easily connected to on-premises software through an internet service bus.
  • And the list goes on…

With Windows Azure will come many interesting deployment tools that will make deploying and managing applications in the cloud easy. Microsoft has created many new and interesting technologies to build what they call the “Fabric”. This “Fabric” is the abstraction that sits on top of the actual servers running inside their many datacenters. You can think of it as an abstraction that is a few levels higher than virtual machines.

From all of the various sessions on Azure and cloud-based services, it was very clear that this new world of cloud-based software is going to require us to think differently about how we architect, design and implement our software so that it fits a cloud-based paradigm, which, in my opinion, is where things will be headed over the next ten years. Many of these practices involve the proper decoupling of software, as well as other practices that are simply part of good software design.

As part of this new cloud-based initiative, it is clear that WCF and WF (Windows Workflow) are going to be two very fundamental enabling technologies. Up to this point, we haven’t seen much usage of Windows Workflow, but it is clear that it will be a large part of many cloud-based applications. As part of this, Microsoft announced a new server product called “Dublin”, an application server built around WCF services that front Windows Workflow workflows. This server product brings many advances to Windows Workflow that make it an ideal choice for hosting workflow in the cloud.

There were also talks given on the new data technologies that extend SQL Server into the cloud, dubbed “SQL Services”. One of the new services is SQL Server Data Services. This service provides a new way to retrieve data over the internet using REST-based protocols. These protocols provide ways to query and update data using standard HTTP verbs such as GET, POST, PUT and DELETE, and are built on the idea that specific data resources are uniquely addressable using URIs. REST-based protocols are built to scale like the internet itself. It was clear that, in addition to SOAP-based protocols, Microsoft is heavily investing in REST-based protocols as well.

I also attended a session on storing scalable data in the cloud. From what I could gather, one will need to rethink how data is organized in order to take full advantage of the scalability that the cloud offers. What was described was very close to what Amazon does with its data storage services. Basically, there are three storage abstractions in these cloud-based storage services: blobs, tables and queues. From what I could understand, the table storage was pretty basic. Much of the storage seemed to center around entities, which play nicely into Microsoft’s Entity Framework (which was just released with .NET 3.5 Service Pack 1). They alluded to providing more relational storage in the cloud in the future, but none of the current cloud storage options had any relational capabilities. I would imagine that much of this is because they want to provide a massively scalable and reliable data platform, and the relational aspects would make that task much more difficult (which is the same route Amazon appears to have taken). I’m not sure where relational cloud storage would fit in, but I would think they need to have a story here.

In addition to recognizing how difficult it is for companies to write highly scalable software and the resulting need for cloud-based computing, Microsoft has put a considerable investment into “Oslo”, its software modeling technology. In recent years, software has become more and more complex. It is also recognized that there are many aspects of software development that need to be coordinated. These aspects include business analysis, software architecture, software design and implementation, software deployment, etc.

Over the past 20 years, there have been many attempts to model the software development process. Many of these attempts, like Rational Rose, have had very limited success. With “Oslo”, Microsoft has taken software modeling to the extreme; this appears to be one of Microsoft’s most ambitious projects in recent years. In “Oslo”, all of the modeling data is stored in a database repository. On top of this repository sits a highly customizable user interface used to explore it. The user interface is very interesting and provides in-depth views into a given software application and how it connects to other applications and services.

In addition to providing a graphical view of software, “Oslo” also provides a new modeling language called “M” that can be used to model software. “M” also makes it easy to create Domain Specific Languages (DSLs), which in turn make applications easier to build.

As I mentioned, all of the modeling data is stored inside a SQL Server database repository. Runtimes meant to drive WCF, WF and other technologies are then driven from the data stored in the Oslo repository.

While I was very impressed by what I saw of “Oslo”, it appears to be a long way off. Done right, this could really change the way we write and think about software, although it will be many years, in my opinion, before “Oslo” is able to make that type of impact.

Well… that’s my long, drawn-out overview of the PDC. I will add more posts that target the specific sessions I attended, along with links to the online videos of those sessions.