Thursday, November 17

SQL Server 2005, Visual Studio 2005 and BizTalk Server 2006 Launch Event - London

On Monday, I attended the SQL Server 2005, Visual Studio 2005 and BizTalk Server 2006 launch event in London.

1000 people crammed into the Novotel in Hammersmith to hear Bruce Lynn introduce the event and explain the theme of "Organisational Productivity". Previous products have concentrated on individual productivity, but the idea behind this release is to encourage the organisation as a whole to become more efficient by reducing cultural barriers to communication and helping people work more effectively in teams.

This was followed by four more technical presentations. The first of these, by Maris Berzins, described BizTalk 2006. This is currently still in beta, but will be released in Q1 2006. The main enhancements seem to be a management console based on MMC, application-level management and better integration with SharePoint.

A demonstration showed how you can see a workflow, with the number of people at each stage, and be notified when action is needed – e.g. if orders cannot be fulfilled because you are waiting for stock, you can take a business decision to order more. The business rules can easily be coded using .NET programming languages.

Many business processes involve complex workflows, and can quite easily get stuck at a stage, for example if the person responsible is on holiday. BizTalk can keep track of these stalled processes and allow action to be taken to reroute work as necessary. It is also easy to report on the situation, so if a particular stage of a process proves to be a regular bottleneck, it can be redesigned.

The second presentation, by Keith Burns, described various ways in which SQL Server 2005 is more fault-tolerant than its predecessors. SQL Server 2005 can continue to work, in part, when parts of the system fail. To get the benefit of this, you should place filegroups on different physical disks, and tables can be partitioned across different filegroups.
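
As a rough sketch of what that might look like (the database, filegroup, table and file names below are all made up for illustration), a table can be spread across filegroups on separate physical disks with something like:

    ALTER DATABASE Sales ADD FILEGROUP FG2004;
    ALTER DATABASE Sales ADD FILEGROUP FG2005;
    -- back each filegroup with a file on a different physical disk
    ALTER DATABASE Sales ADD FILE (NAME = Sales2004, FILENAME = 'D:\Data\Sales2004.ndf') TO FILEGROUP FG2004;
    ALTER DATABASE Sales ADD FILE (NAME = Sales2005, FILENAME = 'E:\Data\Sales2005.ndf') TO FILEGROUP FG2005;

    -- partition an Orders table by year, one partition per filegroup
    CREATE PARTITION FUNCTION pfOrderYear (datetime) AS RANGE RIGHT FOR VALUES ('2005-01-01');
    CREATE PARTITION SCHEME psOrderYear AS PARTITION pfOrderYear TO (FG2004, FG2005);
    CREATE TABLE Orders (OrderID int NOT NULL, OrderDate datetime NOT NULL) ON psOrderYear (OrderDate);

Rows dated before 2005 then live in the file on one disk and later rows in the other, which spreads both the I/O load and the risk.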

Databases can now be mirrored to a different server, without the need for hardware support, giving faster recovery when the principal database fails.
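
I haven't tried this yet, but as far as I can tell the setup is plain T-SQL, roughly along these lines (the server names and port number here are placeholders):

    -- on both servers, create a mirroring endpoint
    CREATE ENDPOINT Mirroring
        STATE = STARTED
        AS TCP (LISTENER_PORT = 5022)
        FOR DATABASE_MIRRORING (ROLE = PARTNER);

    -- restore a backup on the mirror WITH NORECOVERY, then pair the two copies up
    ALTER DATABASE Sales SET PARTNER = 'TCP://principal.example.com:5022';  -- run on the mirror
    ALTER DATABASE Sales SET PARTNER = 'TCP://mirror.example.com:5022';     -- run on the principal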

There is better support for peer-to-peer replication, so that data can be updated locally and propagated around your network without the need for a central distributor.

The presentation also described how concurrency has been improved. This focussed on the fact that index rebuilds no longer lock the tables involved, and that there is support for row versioning as well as locking, which improves performance by reducing the number of transactions that end in deadlock.
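
For reference, the syntax for both of these (reusing the made-up Orders table from above, with an imaginary index name) appears to be something like:

    -- rebuild an index without locking the table out (an Enterprise Edition feature)
    ALTER INDEX IX_Orders_OrderDate ON Orders REBUILD WITH (ONLINE = ON);

    -- turn on row versioning so readers no longer block writers
    ALTER DATABASE Sales SET READ_COMMITTED_SNAPSHOT ON;
    -- or allow explicit snapshot transactions
    ALTER DATABASE Sales SET ALLOW_SNAPSHOT_ISOLATION ON;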

When we upgrade to 2005, it will certainly be worth considering filegroups a lot more than I have done in the past, and distributing files in a way that maximises both fault tolerance and performance.

The third presentation, by Mark Quirk, described Visual Studio Team System (the most expensive version, aimed at large development teams) and the tools within it that allow teams to work together. There are different versions of this for a number of roles (Solution Architect, Software Developer, Tester, and Infrastructure Architect). Views are available describing the status of the project for lower forms of life such as project managers and business sponsors; typically these are created using SharePoint. Microsoft seem to think that everyone has a job that fits nicely into one of those roles.

Each person in the process can see a view of the project from their perspective, and there are also business intelligence functions, e.g. to allow the number of bugs to be graphed.

There are two development methodologies built into the system, Agile and CMMI, and rules can be set up, for example, to make source code conform to certain specifications before it is checked into the source control system (which replaces SourceSafe). There is also basic unit testing functionality in the system.

The testing tools were then demonstrated and looked impressive. You can record a test script using a special version of IE, and then modify the script so that form values are read from a database (e.g. if you want to try logging on to a system multiple times with different accounts). You can set up tests to run in different browsers and simulate different network bandwidths. This could be used for load testing a system with 80% of users on IE, 20% on Netscape, 60% on broadband and 40% on dial-up connections. Graphs can be generated of overall response times, and this could be scripted to run regularly so that any performance trends can be understood.

The next part of the presentation looked at Smart Clients. Traditionally, there has been a trade-off between thick client applications, which give a rich user experience but are difficult to deploy, and web-based applications, which are easy to deploy but give a less rich experience. Microsoft’s preference is to develop rich applications but greatly simplify the deployment process, so that a running application automatically detects updates and updates itself. They seemed to prefer this approach to AJAX-style technologies, where applications run in a web browser but use JavaScript and web services to give a richer and more responsive user experience.

After that, ASP.NET master pages were discussed. Master pages are another way of creating pages with a standard layout: a content page inherits from a master page, which contains the headers and footers. The master page contains a placeholder tag (ContentPlaceHolder), and the content page's content is inserted there. This looks marginally easier to use than traditional include files.

Finally, there have been some improvements in security, and it is easier to hide parts of a web site, such as options on a navigation menu that should only be visible to certain security roles.

I think the main thing I'll look at will be the testing tool. It's difficult at the moment to simulate many concurrent logins with different browsers and connection speeds. Testing could be scripted to run overnight, and response times and any problems could be emailed automatically to developers before they become too serious.

I'm less convinced about Smart Clients, however. I think AJAX-style technologies are better able to deliver responsive systems that are easy to deploy, and they have the added advantage of being more platform independent.

The final presentation, by Rob Gray and Mark Anderson, examined the business intelligence functions that are integrated with SQL Server 2005 Enterprise Edition. The aim in SQL Server 2005 is to move business intelligence from being a tool that only a few high-level strategic decision makers use towards being something far more general.

There are three parts to this: Integration Services, Analysis Services and Reporting Services.

Integration Services replaces the current DTS packages, and allows data to be cleansed as it is imported into a SQL Server database. “Fuzzy Lookups” were demonstrated, where records that approximately match records already in the database can be found (e.g. names that are spelled slightly differently).

Analysis Services aims to create a “Unified Dimensional Model” of a database that can be queried with OLAP-style queries, and Reporting Services allows this to be queried and drilled down into in various user-friendly ways.

An impressive demonstration showed how these fuzzy lookups can be used in the common situation where two database tables have to be matched even though the data differs slightly.
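
The Fuzzy Lookup transform itself is configured in the Integration Services designer rather than written by hand, and uses its own similarity algorithm, but the general idea of approximate matching can be illustrated with a much cruder SOUNDEX comparison in plain T-SQL (the table and column names here are invented):

    -- crude approximation: pair up customer names that sound alike
    SELECT s.CustomerName AS StagingName, m.CustomerName AS MasterName
    FROM StagingCustomers s
    JOIN MasterCustomers m
      ON DIFFERENCE(s.CustomerName, m.CustomerName) >= 3  -- 4 = matching SOUNDEX codes, 0 = nothing in common
    WHERE s.CustomerName <> m.CustomerName;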

Overall, I felt there were many "quick wins" where introducing these technologies could provide a rapid benefit. I was less convinced by the team system functionality, as I believe teams should organise themselves to reflect the strengths and characters of their members, and not have roles and ways of working imposed on them. It looks like it's going to be an involved process for a smaller team to set this up for their own ways of working.

However, I'm looking forward to receiving my free copies of Visual Studio Professional and SQL Server Standard in the post soon, when I'll get the chance to see whether they're really as easy to use as the demonstrations made them out to be.
