Welcome to the SQL PASS Summit Day 2 keynote.  Today brought another round of announcements, this time from Quentin Clark, the Microsoft Corporate VP in charge of the SQL Server database engine.

Quentin started his talk by discussing how SQL Server 2012 will improve the way that SQL Server provides many 9s of uptime with the introduction of AlwaysOn.  The StreamInsight feature now also includes an HA option, which will allow you to keep the StreamInsight system processing all the time.

The first demo of today's keynote covered my favorite SQL Server 2012 feature, AlwaysOn.  The demo showed a three node cluster, spanning two data centers, which hosted the production application.  During the demo the presenter quickly and easily set up a readable secondary.  The Availability Group was then marked within the AlwaysOn routing configuration as having a readable secondary, and the Reporting Services reports were configured to automatically route their connections to the readable secondary in order to keep the reporting workload off of the production SQL Server.  (I’ll be blogging more about this feature later.)
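Based on what was shown, the routing piece of the demo should look roughly like this in T-SQL (a sketch from the SQL Server 2012 CTP syntax; the availability group and server names here are made up):

```sql
-- Allow read-intent connections on the secondary replica (hypothetical names)
ALTER AVAILABILITY GROUP [ProdAG]
MODIFY REPLICA ON N'SQL2'
WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY,
        READ_ONLY_ROUTING_URL = N'TCP://SQL2.contoso.local:1433'));

-- Tell the primary where to send read-intent connections
ALTER AVAILABILITY GROUP [ProdAG]
MODIFY REPLICA ON N'SQL1'
WITH (PRIMARY_ROLE (READ_ONLY_ROUTING_LIST = (N'SQL2')));
```

Clients (such as the Reporting Services data sources in the demo) then opt in to the routing by adding ApplicationIntent=ReadOnly to their connection string.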

In the second part of the keynote Quentin talked about ColumnStore indexes and how this feature will make running reports and data analysis much, much quicker.  Quentin then talked about how PowerPivot has been enhanced, specifically when implemented along with SharePoint, which will allow users to create their own reports within the IT controlled platform of SharePoint (which is great unless you don’t have, or can’t afford, SharePoint).
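For reference, creating a ColumnStore index is a single statement in the 2012 CTP (the table and column names below are hypothetical):

```sql
-- Nonclustered columnstore index over a fact table (hypothetical schema)
CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_CS
ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, Quantity);
```

One thing to keep in mind: in this release the table becomes read-only while the columnstore index exists, so it is aimed squarely at data warehouse workloads.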

The next demo covered the Data Quality Services (DQS) and Master Data Services (MDS) features.  DQS is a pretty cool feature which allows you to clean up data problems, such as a lat/long that doesn’t match a location’s address, by using data from the SQL Azure data marketplace.  During the demo a couple of stores were shown located in the middle of the water around Seattle.  By using DQS the stores’ locations were quickly and easily moved from the middle of the water to their correct locations in Seattle.

Quentin then talked about the Fast Track program, which allows hardware vendors to sell pre-packaged, pre-configured systems with SQL Server installed, making it easier for customers to purchase their SQL Server servers.  Another great program that Microsoft has is the appliance solution, where you can get a server from the vendor up and running in just a couple of hours, ready for data to be loaded into it.

Microsoft introduced the HP Enterprise Database Consolidation Appliance.  This system is basically a private cloud appliance: a rack of servers with 192 cores, 400 disk drives (supporting up to 60k SQL Server IOs) and 2 TB of RAM, which can be purchased as a half rack or a full rack.  As I understand it, the appliance is effectively a Windows Hyper-V cluster where you can fire up VMs within the system quickly and easily.  Where the system really saves you is the ability to deploy it as a single unit, quickly and easily, without having to manually test all the components, as HP will be testing and certifying that the hardware is working correctly before shipping the unit.  From the support side of things you get a single phone number to call in order to get support for both the hardware and the software.

The next announcement for today was a traditional ODBC driver for the Linux platform, so that applications running on non-Microsoft platforms can connect to a SQL Server database without any real changes to the application.  Quentin also announced that Change Data Capture is now supported for SSIS as well as for Oracle (I’m not really sure what this means or how this works).

Semantic Search was the next feature demoed.  Semantic Search is used along with the new FileTable feature, which can load files into the database through the file system.  Semantic Search can then read those files and not just make their contents searchable; it can understand the data within the files.  This means you can find files based on specific searches, and then find documents related to what you just found, using not just specific search terms but the original document itself to drive the matching.
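Pulling the two features together, the demo boils down to something along these lines (a sketch based on the 2012 CTP syntax; the table name and document key are made up, and the database needs full-text search with the statistical semantics database installed):

```sql
-- Hypothetical FileTable; files dropped into its share appear as rows
CREATE TABLE dbo.Documents AS FILETABLE
WITH (FILETABLE_DIRECTORY = N'Documents');
GO

-- Top key phrases that Semantic Search extracted from the stored files
SELECT TOP (10) KEYP.document_key, KEYP.keyphrase, KEYP.score
FROM SEMANTICKEYPHRASETABLE(dbo.Documents, file_stream) AS KEYP
ORDER BY KEYP.score DESC;

-- Documents most similar to one source document (hypothetical key value)
DECLARE @source_doc_key int = 1;
SELECT TOP (10) SIM.matched_document_key, SIM.score
FROM SEMANTICSIMILARITYTABLE(dbo.Documents, file_stream, @source_doc_key) AS SIM
ORDER BY SIM.score DESC;
```

The second query is the "find related documents" piece: the whole source document, not a hand-typed search term, is what drives the match.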

Another name change for this week is that Project Juneau is now called SQL Server Data Tools.

You can now, from within SQL Server Management Studio, right click on a database and migrate it directly to SQL Azure.  This is done by creating a DACPAC (v2.0) and pushing it to Azure with just a few clicks.  Another newly supported Azure feature is the ability to back up a local or Azure database to Azure file storage by creating what is called a BACPAC, which is then copied to the Azure file storage.  So far we haven’t seen a way to do this through T-SQL.  You can also use this technology to back up your Azure database to Azure file storage, then use SSMS to connect to the file storage and restore the database to a local SQL Server instance for dev and testing.

SQL Azure now supports federations on the database side of the Azure platform, in order to dynamically scale the database platform quickly and easily through the management portal without any application changes.  With the year-end release of SQL Azure, larger databases of up to 150 GB will be supported, as well as the ability to use a different collation (this uses contained databases).
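For those who prefer T-SQL to the portal, the federations feature exposes statements along these lines (a sketch based on the SQL Azure Federations preview; the federation name, key, and values are hypothetical):

```sql
-- Hypothetical federation sharded over a customer id
CREATE FEDERATION CustomerFederation (cust_id BIGINT RANGE);
GO
-- Split the data across a second federation member at cust_id = 100000
ALTER FEDERATION CustomerFederation SPLIT AT (cust_id = 100000);
GO
-- Route this connection to the member that holds a particular customer
USE FEDERATION CustomerFederation (cust_id = 12345)
    WITH RESET, FILTERING = OFF;
```

The SPLIT operation is the "dynamically scale" piece: data is redistributed online, and connections keep routing correctly without application changes.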

Available today in CTP form, Microsoft has introduced the SQL Azure Reporting CTP as well as the SQL Azure Data Sync CTP.  The Reporting CTP allows you to run SSRS reports from the SQL Azure cloud.  The Data Sync CTP (which is the final CTP) allows you to easily sync data between a local instance and an Azure instance, or from an Azure instance to a SQL instance at another cloud provider.

Hopefully you are as excited about some of these new features as I am, as some of this stuff just looks really, really cool.

Denny
