Feed aggregator

Analytics Platform System Update 4 Generally Available

Matt Usher, Senior Program Manager, DS SQL Engineering

We are pleased to announce the general availability of Appliance Update 4 for the Microsoft Analytics Platform System. The Microsoft Analytics Platform System is a turnkey big data analytics appliance that combines Microsoft’s massively parallel processing (MPP) data warehouse technology with HDInsight, Microsoft’s 100% Apache Hadoop distribution. APS’s big data capability is powered by PolyBase, an integrated technology that allows customers to use the T-SQL skillset they’ve already developed to query and manage data in Hadoop platforms.
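
To give a feel for what that looks like, here is a minimal, hypothetical sketch in SQL Server-style PolyBase DDL; all names are illustrative and the exact options vary by APS version:

-- Hypothetical example: expose a pipe-delimited HDFS directory as an external table.
CREATE EXTERNAL DATA SOURCE MyHadoopCluster
WITH (TYPE = HADOOP, LOCATION = 'hdfs://myhadoop:8020');

CREATE EXTERNAL FILE FORMAT PipeDelimitedText
WITH (FORMAT_TYPE = DELIMITEDTEXT, FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

CREATE EXTERNAL TABLE dbo.WebClickStream
(
    UserId   INT,
    Url      NVARCHAR(400),
    ClickUtc DATETIME2
)
WITH (LOCATION = '/clickstream/', DATA_SOURCE = MyHadoopCluster, FILE_FORMAT = PipeDelimitedText);

-- From here on it is ordinary T-SQL against Hadoop-resident data:
SELECT TOP 10 Url, COUNT(*) AS Hits
FROM dbo.WebClickStream
GROUP BY Url
ORDER BY Hits DESC;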

This update delivers several new capabilities and features, including:

PolyBase/Hadoop enhancements
  • Support for Cloudera 5.1 and Hortonworks Data Platform 2.1 and 2.2
  • Extended partial aggregate pushdowns into Hadoop
  • Grow HDInsight region on an appliance with an existing region
T-SQL compatibility improvements to reduce migration friction from SQL SMP
  • Scalar User Defined Functions – CREATE/DROP FUNCTION statements
  • PIVOT/UNPIVOT relational operators (both sketched just after this list)
  • Round Robin as a third table type for hybrid performance
  • Query plan improvements to favor hash joins over nested loop joins in DW scenarios for better query performance
  • Addition of a new DBCC command - DBCC DROPCLEANBUFFERS
  • SQL Server SMP to APS Migration Utility
Performance
  • 1.5x data return rate improvement for SELECT queries to improve external analysis integration
OEM Hardware on AU4
  • Dell, Quanta, and HP (Gen9) next generation server support
    • Intel Haswell processors, 256 GB (16x16 GB) 2133 MHz memory
  • Updated HP 5900 series switches including High Availability improvements
  • Mixed hardware version support allows extended appliance lifetime and expandability
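
To give a feel for the new T-SQL surface area (the scalar UDF and PIVOT items above), here is a minimal sketch; the table and column names are hypothetical:

-- Scalar user-defined function (new in AU4):
CREATE FUNCTION dbo.FahrenheitToCelsius (@f FLOAT)
RETURNS FLOAT
AS
BEGIN
    RETURN (@f - 32.0) * 5.0 / 9.0;
END;
GO

-- PIVOT: rotate yearly sales rows into columns (hypothetical dbo.Sales table):
SELECT Region, [2013], [2014], [2015]
FROM (SELECT Region, SaleYear, Amount FROM dbo.Sales) AS src
PIVOT (SUM(Amount) FOR SaleYear IN ([2013], [2014], [2015])) AS p;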

Together with SQL Data Warehouse, Microsoft offers a choice of MPP data warehouse deployments: on-premises, in the cloud, or both.

To learn more, please see the Microsoft Analytics Platform portal.

Related Link:
Microsoft Analytics Platform System Appliance Update 4 Documentation and Client Tools

Categories: Companies

Should our front-end websites be server-side at all?

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

I’ve been toying around with projects like Jekyll, Hexo and even some hand-rolled software that will generate me HTML files based on data. The thought that crossed my mind was…

Why do we need dynamically generated HTML again?

Let me take a few examples and build my case.

Example 1: Blog

Of course the simpler examples like blogs could literally all be static. If you need comments, you could go with a system like Disqus. That is quite literally one of the only parts of such a system that is dynamic.

RSS feed? Generated from posts. Posts themselves? They could be automatically generated from a database or Markdown files periodically. The resulting output can be hosted on a Raspberry Pi without any issues.

Example 2: E-Commerce

This one is more of a problem. Here are the things that don’t change a lot. Products. OK, they may change but do you need to have your site updated right this second? Can it wait a minute? Then all the “product pages” could literally be static pages.

Product reviews? They will need to be “approved” anyway before you want them live. Put them in a server-side queue, and regenerate the product page with the updated review once it’s done.

There are three things I see that would need to be dynamic in this scenario.

Search, checkout and reviews. Search, because as your product catalog scales up, so does your data; doing the search client-side won’t scale at any level. Checkout, because we are now handling an actual order and it needs a server component. Reviews, because we’ll need to approve and publish them.

In this scenario, only the search is an actual “read” component that is now server-side. Everything else? Pre-generated. Even if the search brings you the list of products dynamically, it can still end up on a static page.

All the other write components? Queued server side to be processed by the business itself with either Azure or an off-site component.

All the backend side of the business (managing products, availability, sales and whatnot) will need a management UI that will be 100% dynamic (read/write).

Question

So… do we need a dynamic front-end built with the latest server framework? On the public-facing site too, or just the backend?

If you want to discuss it, Tweet me at @MaximRouiller.

Categories: Blogs

You should not be using WebComponents yet

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

Have you read about WebComponents? It sounds like something that we have all tried to achieve on the web for... well... a long time.

If you take a look at the specification, it's hosted on the W3C website. It smells like a real specification. It looks like a real specification.

The only issue is that Web Components is really four specifications. Let's take a look at all four of them.

Reviewing the specifications

HTML Templates

Specification

This specific specification is not part of the "Web components" section. It has been integrated into HTML5. Hence, this one is safe.

Custom Elements

Specification

This specification is for review and not for implementation!

Alright, no. Let's not touch this yet.

Shadow DOM

Specification

This specification is for review and not for implementation!

Wow. Okay so this is out of the window too.

HTML Imports

Specification

This one is still a working draft so it hasn't been retired or anything yet. Sounds good!

Getting into more details

So open all of those specifications. Go ahead. I want you to read one section in particular: the author/editors section. What do we learn? That those specs were drafted, edited and all done by the Google Chrome team. Except maybe HTML Templates, which has Tony Ross (previously a PM on the Internet Explorer team).

What about browser support?

Chrome already has all the specs implemented.

Firefox has implemented them but put them behind a flag (in about:config, search for dom.webcomponents.enabled).

In Internet Explorer, they are all Under Consideration.

What that tells us

Google is pushing for a standard. Hard. They built the spec and are pushing it very hard, since all of this is available in Chrome STABLE right now. No other vendor has contributed to the spec itself. Polymer is also a project built around WebComponents, and it's built by... well... the Chrome team.

That tells me that nobody right now should be implementing this in production. If you want to contribute to the spec, fine. But WebComponents are not to be used.

Otherwise, we're only getting into the same situation we were in 10-20 years ago with Internet Explorer, and we know it's a painful path.

What is wrong right now with WebComponents

First, it's not cross-platform. We've handled that in the past; that's not something that will stop us.

Second, the current specification is being implemented in Chrome as if it were recommended by the W3C (it is not). That may lead to changes in the specification which could render your current implementation completely inoperable.

Third, there's no guarantee that the current spec is even going to be accepted by the other browsers. If we get there and Chrome doesn't move, we're back to the Internet Explorer 6 era, but this time with Chrome.

What should I do?

As far as "Production" is concerned, do not use WebComponents directly. Also avoid Polymer, as it's only a simple wrapper around WebComponents (even with the polyfills).

Use frameworks that abstract away the WebComponents part, like X-Tag or Brick. That way you can benefit from the features without learning a specification that may become obsolete very quickly or never be implemented at all.

Categories: Blogs

Fix: Error occurred during a cryptographic operation.

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

Have you ever had this error while switching between projects using the Identity authentication?

Are you still wondering what it is and why it happens?

Clear your cookies. The FedAuth cookie is encrypted using the machine key defined in your web.config. If there is none defined in your web.config, an auto-generated one is used. If the key used to encrypt isn't the same one used to decrypt?

Boom goes the dynamite.
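
If you want the cookie to survive across projects and machines, the usual fix is to pin an explicit machine key in web.config. A minimal sketch, with placeholder values (generate your own keys and keep them secret):

<system.web>
  <!-- Placeholder values: generate your own validation/decryption keys -->
  <machineKey validation="HMACSHA256" decryption="AES"
              validationKey="...your-generated-validation-key..."
              decryptionKey="...your-generated-decryption-key..." />
</system.web>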

Categories: Blogs

Renewed MVP ASP.NET/IIS 2015

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

Well there it goes again. It was just confirmed that I am renewed as an MVP for the next 12 months.

Becoming an MVP is not an easy task. Offline conferences, blogs, Twitter, helping manage a user group. All of this is done in my free time and it requires a lot of time. But I'm so glad to be part of the big MVP family once again!

Thanks to all of you who interacted with me last year, let's do it again this year!

Categories: Blogs

Failed to delete web hosting plan Default: Server farm 'Default' cannot be deleted because it has sites assigned to it

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

So I had this issue where I was moving web apps between hosting plans. Once they were all transferred, I wondered why it refused to delete the old plan with this error message.

After a few clicks left and right and a lot of wasted time, I found this blog post that provides a script to help you debug, and the exact explanation as to why it doesn't work.

To make things quick, it's all about "Deployment Slots". Among other things, they have their own serverFarm setting, and it will not change when you change their parent's in PowerShell (haven't tried via the portal).

Here's a copy of the script from Harikharan Krishnaraju for future reference:

Switch-AzureMode AzureResourceManager
$Resource = Get-AzureResource

foreach ($item in $Resource)
{
    if ($item.ResourceType -Match "Microsoft.Web/sites/slots")
    {
        # Deployment slots carry their own webHostingPlan property
        $plan = (Get-AzureResource -Name $item.Name -ResourceGroupName $item.ResourceGroupName -ResourceType $item.ResourceType -ParentResource $item.ParentResource -ApiVersion 2014-04-01).Properties.webHostingPlan
        Write-Host "WebHostingPlan" $plan "under site" $item.ParentResource "for deployment slot" $item.Name
    }
    elseif ($item.ResourceType -Match "Microsoft.Web/sites")
    {
        $plan = (Get-AzureResource -Name $item.Name -ResourceGroupName $item.ResourceGroupName -ResourceType $item.ResourceType -ApiVersion 2014-04-01).Properties.webHostingPlan
        Write-Host "WebHostingPlan" $plan "under site" $item.Name
    }
}

Categories: Blogs

Switching Azure Web Apps from one App Service Plan to another

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

So I had to make some changes to App Service Plans for one of my clients. My first thought was to do it in the portal. A few clicks and I'm done!

But before I get into how to move one of them, I'll need to tell you why I needed to move 20 of them.

Consolidating the farm

First, my client had a lot of WebApps deployed left and right in different "Default" service plans. Most were created automatically by scripts or even Visual Studio. Each had a different instance size and different scaling capabilities.

We needed a way to standardize how we scale and especially the instance sizes we deployed on. So we came up with a list of the hosting plans we needed, the apps that had to be moved, and the hosting plan each one was currently on.

That list came to 20 web apps to move. The portal wasn't going to cut it. It was time to bring in the big guns.

Powershell

PowerShell is the command line for Windows. It's powered by awesomeness and cats riding unicorns. It allows you to do things like remote-control Azure, import/export CSV files and so much more.

CSV and Azure were what I needed. Since we had built the list of web apps to migrate in Excel, CSV was the way to go.

The Code or rather, The Script

What follows is the script I used. It's heavily inspired by what I found online.

My CSV file has 3 columns: App, ServicePlanSource and ServicePlanDestination. Only two are used for the actual command. I could have made this command more generic but since I was working with apps in EastUS only, well... I didn't need more.

This script should be considered "works on my machine" quality; I haven't tested all the edge cases.

Param(
    [Parameter(Mandatory=$True)]
    [string]$filename
)

Switch-AzureMode AzureResourceManager

# All the apps in this example live in the same resource group
$rgn = 'Default-Web-EastUS'

$allAppsToMigrate = Import-Csv $filename
foreach ($app in $allAppsToMigrate)
{
    if ($app.ServicePlanSource -ne $app.ServicePlanDestination)
    {
        $appName = $app.App
        $source = $app.ServicePlanSource
        $dest = $app.ServicePlanDestination

        # Read the site resource, then write it back with the new server farm
        $res = Get-AzureResource -Name $appName -ResourceGroupName $rgn -ResourceType Microsoft.Web/sites -ApiVersion 2014-04-01
        $prop = @{ 'serverFarm' = $dest }
        $res = Set-AzureResource -Name $appName -ResourceGroupName $rgn -ResourceType Microsoft.Web/sites -ApiVersion 2014-04-01 -PropertyObject $prop
        Write-Host "Moved $appName from $source to $dest"
    }
}

Categories: Blogs

Microsoft Virtual Academy Links for 2014

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

So I thought that going through a few Microsoft Virtual Academy links could help some of you.

Here are the links I think deserve at least a click. If you find them interesting, let me know!

Categories: Blogs

Temporarily ignore SSL certificate problem in Git under Windows

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

So I've encountered the following issue:

fatal: unable to access 'https://myurl/myproject.git/': SSL certificate problem: unable to get local issuer certificate

Basically, we're working on a Git project hosted on a local Stash server, and the certificates changed. While the certificates were being fixed, we had to keep working.

So I know that the server is not compromised (I talked to IT). How do I tell Git "ignore it, please"?

Temporary solution

This is for when you know the certificates are going to be fixed.

PowerShell code:

$env:GIT_SSL_NO_VERIFY = "true"

CMD code:

SET GIT_SSL_NO_VERIFY=true

This will get you up and running as long as you don’t close the command window. This variable will be reset to nothing as soon as you close it.

Permanent solution

Fix your certificates. Oh… you mean it’s self-signed and you will forever use that one? Install it on all machines.

Seriously. I won’t show you how to permanently ignore certificates. Fix your certificate situation, because trusting ALL certificates without caring whether they are valid is just plain dangerous.

Fix it.

NOW.

Categories: Blogs

The Yoda Condition

Decaying Code - Maxime Rouiller - 8 hours 24 min ago

So this will be a short post. I would like to introduce a term into my vocabulary, and yours too, if it isn't there already.

First, I would like to credit Nathan Smith for teaching me that term this morning. Here's the tweet:

Chuckling at "disallowYodaConditions" in JSCS… https://t.co/unhgFdMCrh — Awesome way of describing it. pic.twitter.com/KDPxpdB3UE

— Nathan Smith (@nathansmith) November 12, 2014

So... this made me chuckle.

What is the Yoda Condition?

The Yoda Condition can be summarized as "inverting the operands of a comparison in a conditional".

Let's say I have this code:

string sky = "blue";
if (sky == "blue") {
    // do something
}

It can be read easily as "If the sky is blue". Now let's put some Yoda into it!

Our code becomes :

string sky = "blue";
if ("blue" == sky) {
    // do something
}

Now our code reads as "if blue is the sky". And that's why we call it a Yoda condition.

Why would I do that?

First, if you type a single "=" instead of "==" by mistake, the code will fail at compile time, since you can't assign to a string literal. It can also avoid certain null reference errors.

What's the cost of doing this then?

Besides getting on the nerves of all the programmers on your team? You reduce the readability of your code by a huge factor.

Each developer on your team will hit a snag on every if since they will have to learn how to speak "Yoda" with your code.

So what should I do?

Avoid it. At all costs. Readability is the most important thing in your code. To be honest, you're not going to be the only guy/girl maintaining that app for years to come. Make it easy for the maintainer and remove that Yoda talk.

The problem this kind of code solves isn't worth the readability you lose.

Categories: Blogs

Do you have your own Batman Utility Belt?

Decaying Code - Maxime Rouiller - 8 hours 24 min ago
Just like most of us on any project, you (yes you!) as a developer must have done the same thing over and over again. I'm not talking about coding a controller or accessing the database.

Let's check out some concrete examples shall we?

  • Have you ever set up HTTP caching properly, created a class for it in your project and called it done?
  • What about creating a proper Web.config to configure static asset caching?
  • And what about creating a MediaTypeFormatter for handling CSV or some other custom type?
  • What about that BaseController that you rebuild from project to project?
  • And those extension methods that you use ALL the time but rebuild for each project...

If you answered yes to any of those questions... you are at great risk of having to code those again.

Hell... maybe someone already built them out there. But more often than not, they will be packed with other classes that you are not using. However, most of those projects are open source and will allow you to build your own Batman utility belt!

So once you see that you do something often, start building your utility belt! Grab those open source classes left and right (make sure to follow the licenses!) and start building your own class library.

NuGet

Once you have a good collection that is properly separated into a project and you feel ready to kick some monkey ass, the only way to go is to use NuGet to pack it together!

Check out the reference to make sure that you do things properly.

NuGet - Publishing

So you've got a hot new NuGet package ready to use? You can push it to the main repository if your intention is to share it with the world.

If you are not ready quite yet, there are multiple ways to use a NuGet package internally in your company. The easiest? Just create a share on a server and add it to your package sources! As simple as that!

Now just make sure to increment your version number on each release following the SemVer convention: for example, bump 1.2.3 to 1.2.4 for a bug fix, to 1.3.0 for new features, and to 2.0.0 for breaking changes.

Reap the profit

OK, no... not really. You probably won't be making money with this library anytime soon. At least not real money. Where you will gain, however, is when you are asked to do one of those boring tasks yet again on another project or for another client.

The only thing you'll do is import your magic package, use it, and boom. That task they planned a whole day for? Finished in minutes.

As you build up your toolkit, more and more tasks will become easier to accomplish.

The only thing left to consider is what NOT to put in your toolkit.

Last minute warning

If you have an employer, make sure that your contract allows you to reuse code. Some contracts allow you to do that, but double check with your employer.

If you are a company, make sure not to bill your client for the time spent building your tools, or they might have the right to claim them as their own since you billed for them.

In case of doubt, double check with a lawyer!

Categories: Blogs

Run Configurations: debug any static method in Visual Studio, and more

JetBrains .NET Tools Blog - 11 hours 14 min ago

Sometimes when you’re writing code, you want to verify an assumption quickly, check how your API works, or simply prototype and execute a small part of your code.

What common options are available to us?

  1. Create a separate small project, put the code to verify into its Main() method, and add all the necessary dependencies.
  2. Hook the code to verify to a random button and start the debugger (and hopefully remember to remove the code afterwards).
  3. Write a temporary unit test and execute it with the Unit Test Runner.

Well, with ReSharper 9.2’s new run configurations there’s now a better way.

With the caret on a static method without parameters, press Alt+Enter to see a context action called Debug:

ReSharper's Debug context action

Calling this action triggers debugging, with this very static method used as an entry point. In other words, without breaking your coding flow, you can write a public static void SomeTestMethod() right where you’re coding and quickly launch the debugger.

Under the hood, this context action is based on a new feature, run configurations. The feature grabs information required to execute from the current project, creates an implicit run configuration, and actually executes it.

The Debug menu item can be expanded to reveal the entire set of available actions: Debug, Run and Profile (available if dotTrace is installed), each in two varieties: with or without building. Each of the expanded actions opens a dialog box where you can refine configuration parameters as necessary, and execute the configuration with or without saving it for the future.

Run or debug any static method without parameters

When you choose to save and execute, your settings are persisted so that you can reuse them later. There are three ways to access a saved configuration:

    • A pinned gutter icon next to a static method
      Gutter icon for a saved run configuration
    • The Run Configurations pop-up that is available by selecting ReSharper | Tools | Run Configurations or with a shortcut (Ctrl+Shift+Alt+R in both Visual Studio and IntelliJ keymaps)
      Run Configurations popup
    • If you have dotTrace integrated into Visual Studio, you can also access run configurations via its Performance Snapshots Browser (ReSharper | Profile | Show Performance Snapshots).
      Performance Snapshots Browser

ReSharper 9.2 provides three kinds of run configurations:

    • Executable: Allows running any executable file.
    • Static Method: As suggested above, this lets you run any static method that doesn’t have parameters.
    • Project: Allows executing a Visual Studio project (which in essence emulates setting a project as startup and launching it immediately).

You can add any of these configurations via the Add… submenu in the Run Configurations pop-up.

In addition, there is a new item in the project context menu, Create Run Configuration, which creates a new project run configuration.

The Run Configurations pop-up provides quick keyboard access to configurations. It allows you to execute any run configuration in any supported execution mode. Each configuration’s submenu provides a set of Configure commands.

A run configuration’s Configure menu

One of these commands, Select, deserves a notable mention. Select assigns the current configuration to launch when you invoke Visual Studio’s own Start (F5) and Start Without Debugging (Ctrl+F5) commands. In addition, two Alt-modified shortcuts are added which affect whether or not a solution is built before launch:

    • Alt+F5: Debug without building
    • Ctrl+Alt+F5: Execute a configuration without debugging or building

A setting called Don’t build by default that concludes the Run Configurations pop-up serves to swap the default and Alt-modified actions. For example, it makes F5 invoke debugging without building.

Other Configure options include:

    • Edit, Duplicate and Delete: Fairly self-descriptive.
    • Share/Unshare: Saves a run configuration to shared solution settings (instead of the default personal solution settings), which enables sharing the run configuration among all developers working on the current solution.
Categories: Companies

Installing Ruby on Windows Platform

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article, we will learn how to install Ruby on Windows operating systems.
Categories: Communities

Convert Word Document to PDF using Spire.Doc

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article, we will learn how to convert a Word document to PDF using Spire.Doc.
Categories: Communities

Office 365 - PowerShell Script to Verify User Already Exists or Not

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article, I will explain how to check whether a user already exists in Office 365 and, if not, how to create a new one.
Categories: Communities

Exception Handling in MVC

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article you will learn about exception handling in MVC.
Categories: Communities

Different Approaches to Find Nth Salary in SQL Server 2008R2

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article, you will learn about different approaches to finding the Nth salary in SQL Server 2008 R2.
Categories: Communities

Change Maximum Size of Upload Document File in SharePoint Server 2013

C-Sharpcorner - Latest Articles - 12 hours 9 min ago
In this article, you will learn how to change the maximum size of an uploaded document file in SharePoint Server 2013.
Categories: Communities

New Azure SQL Database offerings add capabilities to scale data up and out

Tiffany Wissner, Senior Director of Data Platform Marketing

At Microsoft we’re committed to meeting the needs of our customers by offering choices as they build cloud applications for scale and performance. Today we’re continuing to deliver on this commitment with two announcements:

As more customers build more powerful applications in Azure using Azure SQL Database, we've seen strong demand for more options to scale performance. We’re dedicated to offering a portfolio of data services for cloud applications that effectively scales up to support data growth from more users, and scales out to support growth from more customers and fluctuating resource demands.

To meet this scale-up demand, we are announcing the release of new Azure SQL Database Premium performance levels, P4 and P11. With the introduction of P4 and P11, we’ve expanded the range of throughput performance developers can leverage to scale their applications. Developers can quickly start with our basic or standard performance tiers and easily scale database performance up to 350 times greater with our premium tier as their data needs grow, all without code or application changes. This enables developers to match application needs with the right database performance.
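
For example, scaling a standalone database up to one of the new levels can be done with a single T-SQL statement (hypothetical database name; the same change can be made through the portal, PowerShell or the REST API):

-- Scale the database up to the new P11 performance level:
ALTER DATABASE [MySaasDb] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P11');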



On the scale-out front, we have seen increasing demand for Elastic Database pools since we announced the public preview. We’ve seen SaaS developers adopt and embrace elastic databases because they make it easier to manage a growing number of customers. With Elastic Database pools, SaaS developers can manage hundreds of customer databases, optimize the performance of each database, and manage the cost of these databases to support their explosive growth. This means their customers still get a great experience, even during peak usage periods, and there’s no need to over-provision resources during lulls.

One example is StudioPlus, which offers solutions for professional photographers. CEO Matthew Hunt says, “We are expanding our legacy on-premises solution with a SaaS solution. Azure SQL Database is a fundamental part of this journey because it allows us to leverage an existing code base and gives us a platform for unlimited customer growth. From a DevOps perspective, it’s simple to set up and manage Azure SQL Database Elastic Database pools to support the unlimited scale-out of a one-database-per-customer pattern.”



To support the increasing demand for Elastic Database pools, we are now introducing a public preview of two additional service tiers for elastic databases, Basic and Premium.  These new tiers give organizations and developers with SaaS applications greater options and flexibility for getting started with elastic databases and scaling out to meet new customer demands.




We are thrilled that we can support StudioPlus and a growing number of developers on their SaaS journey with Azure SQL Database.

Our announcements today add scale and performance options, and these releases are just part of our end-to-end data platform. We continue to make it easier for customers to maximize their data dividends with our data platform and services. It’s never been easier to capture, transform, mash-up, analyze and visualize any data, of any size, at any scale, in its native format using familiar tools, languages and frameworks in a trusted environment on-premises and in the cloud.

Find out more and try Azure SQL Database and Azure SQL Database Elastic Databases today.

Categories: Companies

Announcing Great New SQL Database Capabilities in Azure

ScottGu's Blog - Scott Guthrie - Thu, 08/27/2015 - 17:13

Today we are making available several new SQL Database capabilities in Azure that enable you to build even better cloud applications.  In particular:

  • We are introducing two new pricing tiers for our  Elastic Database Pool capability.  Elastic Database Pools enable you to run multiple, isolated and independent databases on a private pool of resources dedicated to just you and your apps.  This provides a great way for software-as-a-service (SaaS) developers to better isolate their individual customers in an economical way.
  • We are also introducing new higher-end scale options for SQL Databases that enable you to run even larger databases with significantly more compute + storage + networking resources.

Both of these additions are available to start using immediately.

Elastic Database Pools

If you are a SaaS developer with tens, hundreds, or even thousands of databases, an elastic database pool dramatically simplifies the process of creating, maintaining, and managing performance across these databases within a budget that you control. 


A common SaaS application pattern (especially for B2B SaaS apps) is for the SaaS app to use a different database to store data for each customer.  This has the benefit of isolating the data for each customer separately (and enables each customer’s data to be encrypted separately, backed up separately, etc).  While this pattern is great from an isolation and security perspective, each database can end up having varying and unpredictable resource consumption (CPU/IO/Memory patterns), and because the peaks and valleys for each customer might be difficult to predict, it is hard to know how many resources to provision.  Developers were previously faced with two options: over-provision database resources based on peak usage and overpay, or under-provision to save cost at the expense of performance and customer satisfaction during peaks.

Microsoft created elastic database pools specifically to help developers solve this problem.  With Elastic Database Pools you can allocate a shared pool of database resources (CPU/IO/Memory), and then create and run multiple isolated databases on top of this pool.  You can set minimum and maximum performance SLA limits of your choosing for each database you add into the pool (ensuring that none of the databases unfairly impacts other databases in your pool).  Our management APIs also make it much easier to script and manage these multiple databases together, as well as optionally execute queries that span across them (useful for a variety of operations).  And best of all, when you add multiple databases to an Elastic Database Pool, you are able to average out the typical utilization load (because each of your customers tends to have different peaks and valleys) and end up requiring far fewer database resources (and spending less money as a result) than you would if you ran each database separately.

The chart below shows a typical example of what we see when SaaS developers take advantage of the Elastic Pool capability.  Each individual database has different peaks and valleys in terms of utilization.  As you combine multiple of these databases into an Elastic Pool, the peaks and valleys tend to normalize out (since they often happen at different times), requiring far fewer overall resources than you would need if each database were resourced separately:

databases sharing eDTUs

Because Elastic Database Pools are built on our SQL Database service, you also get to take advantage of all of the underlying database-as-a-service capabilities that are built into it: 99.99% SLA, multiple high-availability replica support built in with no extra charges, no downtime during patching, geo-replication, point-in-time recovery, TDE encryption of data, row-level security, full-text search, and much more.  The end result is a really nice database platform that provides a lot of flexibility, as well as the ability to save money.

New Basic and Premium Tiers for Elastic Database Pools

Earlier this year at the //Build conference we announced our new Elastic Database Pool support in Azure and entered public preview with the Standard Tier edition of it.  The Standard Tier allows individual databases within the elastic pool to burst up to 100 eDTUs (a DTU represents a combination of compute + IO + storage performance).

Today we are adding additional Basic and Premium Elastic Database Pools to the preview to enable a wider range of performance and cost options.

  • Basic Elastic Database Pools are great for light-usage SaaS scenarios.  Basic Elastic Database Pools allow individual database performance bursts of up to 5 eDTUs.
  • Premium Elastic Database Pools are designed for databases that require the highest performance per database. Premium Elastic Database Pools allow individual database performance bursts of up to 1,000 eDTUs.

Collectively we think these three Elastic Database Pool pricing tiers provide a tremendous amount of flexibility for SaaS developers to take advantage of, and are designed to enable a wide variety of scenarios.

Easily Migrate Databases Between Pricing Tiers

One of the cool capabilities we support is the ability to easily migrate an individual database between different Elastic Database Pools (including ones with different pricing tiers).  For example, if you were a SaaS developer you could start a customer out with a trial edition of your application and choose to run the database that backs it within a Basic Elastic Database Pool to run it super cost-effectively.  As the customer’s usage grows you could then auto-migrate them to a Standard database pool without customer downtime.  If the customer grows to require a tremendous amount of resources, you could then migrate them to a Premium Database Pool or run their database as a standalone SQL Database with a huge amount of resource capacity.

This provides a tremendous amount of flexibility and capability, and enables you to build even better applications.

Managing Elastic Database Pools

One of the other nice things about Elastic Database Pools is that the service provides the management capabilities to easily manage large collections of databases without you having to worry about the infrastructure that runs them.

You can create and manage Elastic Database Pools using our Azure Management Portal or via our command-line tools or REST management APIs.  With today’s update we are also adding support so that you can use T-SQL to add/remove new databases to/from an elastic pool.  Today’s update also adds T-SQL support for measuring resource utilization of databases within an elastic pool, making it even easier to monitor and track utilization by database.
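
As a sketch of the kind of T-SQL now supported (database and pool names are hypothetical):

-- Move a standalone database into an elastic pool:
ALTER DATABASE [CustomerDb42]
MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL (name = [MyStandardPool]));

-- Move it back out to a standalone performance level:
ALTER DATABASE [CustomerDb42] MODIFY (SERVICE_OBJECTIVE = 'S1');

-- Measure pool-level resource utilization (run against the master database):
SELECT TOP 10 *
FROM sys.elastic_pool_resource_stats
WHERE elastic_pool_name = 'MyStandardPool'
ORDER BY end_time DESC;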

Elastic Database Pool Tier Capabilities

We have been tuning, and will continue to tune, a number of parameters that control the density of Elastic Database Pools as we progress through the preview.

In particular, the current limits on the number of databases per pool and the number of pool eDTUs are something we plan to steadily increase as we march towards the general availability release.  Our plan is to provide the highest possible density per pool, the largest pool sizes, and the best Elastic Database Pool economics, while at the same time keeping our 99.99% availability SLA.

Below are the current performance parameters for each of the Elastic Database Pool Tier options in preview today:

 

                                         Basic Elastic            Standard Elastic          Premium Elastic

Elastic Database Pool
eDTU range per pool (preview limits)     100-1200 eDTUs           100-1200 eDTUs            125-1500 eDTUs
Storage range per pool                   10-120 GB                100-1200 GB               63-750 GB
Maximum databases per pool (preview)     200                      200                       50
Estimated monthly pool cost (preview)    from $0.2/hr             from $0.3/hr              from $0.937/hr
                                         (~$149/pool/mo)          (~$223/pool/mo)           (~$697/pool/mo)
Each additional eDTU (preview)           $0.002/hr (~$1.49/mo)    $0.003/hr (~$2.23/mo)     $0.0075/hr (~$5.58/mo)
Storage per eDTU                         0.1 GB per eDTU          1 GB per eDTU             0.5 GB per eDTU

Elastic Databases
eDTU max per database (preview limits)   0-5                      0-100                     0-1000
Storage max per DB                       2 GB                     250 GB                    500 GB
Per DB cost (preview prices)             $0.0003/hr (~$0.22/mo)   $0.0017/hr (~$1.26/mo)    $0.0084/hr (~$6.25/mo)

We’ll continue to iterate on the above parameters and increase the maximum number of databases per pool as we progress through the preview, and would love your feedback as we do so.

New Higher-Scale SQL Database Performance Tiers

In addition to the enhancements for Elastic Database Pools, we are also today releasing new SQL Database Premium performance tier options for standalone databases. 

Today we are adding new P4 (500 DTU) and P11 (1750 DTU) levels, which provide even higher-performance database options for SQL Databases that need to scale up. The new P11 edition also supports databases up to 1 TB in size.

Developers can now choose from 10 different SQL Database performance levels.  You can easily scale up or down as needed at any point without database downtime or interruption.  Each database performance tier supports a 99.99% SLA, multiple high-availability replica support built in with no extra charges (meaning you don’t need to buy multiple instances to get an SLA; this is built into each database), no downtime during patching, point-in-time recovery options (restore without needing a backup), TDE encryption of data, row-level security, and full-text search.


Learn More

You can learn more about SQL Databases by visiting the http://azure.microsoft.com website.  Check out the SQL Database product page to learn more about the capabilities SQL Databases provide, and read the technical documentation to learn how to build great applications using them.

Summary

Today’s database updates enable developers to build even better cloud applications, and to use data to make them even richer and more intelligent.  We are really looking forward to seeing the solutions you build.

Hope this helps,

Scott

Categories: Blogs