
Feed aggregator

Best Practices for .NET Code Coverage

We have been covering code for over a decade. During that time, we have learned a lot from our customers about how to make the most of implementing meaningful code coverage across teams. We thought it might help if we put together some of the highlights we have learned about the best practices of .NET code coverage and share them with you.

This webinar outlines four categories of best practices, with examples, to guide development efforts and improve overall code quality. The first category focuses on code coverage metrics and the three we rely on in our own organization. The second category outlines how code coverage best practices can be employed as a team. The third category reviews techniques for effectively capturing code coverage data. The fourth and final category relates specifically to NCover and how to keep your system optimized.

Previously recorded and available for immediate viewing.


The post Best Practices for .NET Code Coverage appeared first on NCover.

Categories: Companies

Windows Phone - Package.appxmanifest

C-Sharpcorner - Latest Articles - 8 hours 27 min ago
In this article you will learn Windows Phone “Package.appxmanifest”.
Categories: Communities

Print Receipt and Save Data using Window Form in C#

C-Sharpcorner - Latest Articles - 8 hours 27 min ago
In this article I am going to show how we can make prints of our window form.
Categories: Communities

Azure: Announcing New Real-time Data Streaming and Data Factory Services

ScottGu's Blog - Scott Guthrie - Fri, 10/31/2014 - 07:39

The last three weeks have been busy ones for Azure.  Two weeks ago we announced a partnership with Docker to enable great container-based development experiences on Linux, Windows Server and Microsoft Azure.

Last week we held our Cloud Day event and announced our new G-Series of Virtual Machines as well as Premium Storage offering.  The G-Series VMs provide the largest VM sizes available in the public cloud today (nearly 2x more memory than the largest AWS offering, and 4x more memory than the largest Google offering).  The new Premium Storage offering (which will work with both our D-series and G-series of VMs) will support up to 32TB of storage per VM, >50,000 IOPS of disk IO per VM, and enable sub-1ms read latency.  Combined they provide an enormous amount of power that enables you to run even bigger and better solutions in the cloud.

Earlier this week, we officially opened our new Azure Australia regions – which are our 18th and 19th Azure regions open for business around the world.  Then at TechEd Europe we announced another round of new features – including the launch of the new Azure MarketPlace, a bunch of great network improvements, our new Batch computing service, general availability of our Azure Automation service and more.

Today, I’m excited to blog about even more new services we have released this week in the Azure Data space.  These include:

  • Event Hubs: a scalable service for ingesting and storing data from websites, client apps, and IoT sensors.
  • Stream Analytics: a cost-effective event processing engine that helps uncover real-time insights from event streams.
  • Data Factory: a service that enables better information production by orchestrating and managing diverse data and data movement.

The Azure Event Hubs service is now generally available, and the new Azure Stream Analytics and Data Factory services are now in public preview.

Event Hubs: Log Millions of events per second in near real time

The Azure Event Hub service is a highly scalable telemetry ingestion service that can log millions of events per second in near real time.  You can use the Event Hub service to collect data/events from any IoT device, from any app (web, mobile, or a backend service), or via feeds like social networks.  We are using it internally within Microsoft to monitor some of our largest online systems.

Once you collect events with Event Hub you can then analyze the data using any real-time analytics system (like Apache Storm or our new Azure Stream Analytics service) and store/transform it into any data storage system (including HDInsight and Hadoop based solutions).

Event Hub is delivered as a managed service on Azure (meaning we run, scale and patch it for you and provide an enterprise SLA).  It delivers:

  • Ability to log millions of events per second in near real time
  • Elastic scaling support with the ability to scale-up/down with no interruption
  • Support for multiple protocols including support for HTTP and AMQP based events
  • Flexible authorization and throttling device policies
  • Time-based event buffering with event order preservation

The pricing model for Event Hubs is very flexible – for just $11/month you can provision a basic Event Hub with guaranteed performance capacity to capture 1 MB/sec of events sent to your Event Hub.  You can then provision as many additional capacity units as you need if your event traffic goes higher. 

Getting Started with Capturing Events

You can create a new Event Hub using the Azure Portal or via the command-line.  Choose New->App Service->Service Bus->Event Hub in the portal to do so:


Once created, events can be sent to an Event Hub with either a strongly-typed API (e.g. .NET or Java client library) or by just sending a raw HTTP or AMQP message to the service.  Below is a simple example of how easy it is to log an IoT event to an Event Hub using just a standard HTTP post request.  Notice the Authorization header in the HTTP post – you can use this to optionally enable flexible authentication/authorization for your devices:

POST https://your-namespace.servicebus.windows.net/your-event-hub/messages?timeout=60&api-version=2014-01 HTTP/1.1

Authorization: SharedAccessSignature sr=your-namespace.servicebus.windows.net&sig=tYu8qdH563Pc96Lky0SFs5PhbGnljF7mLYQwCZmk9M0%3d&se=1403736877&skn=RootManageSharedAccessKey

Content-Type: application/atom+xml;type=entry;charset=utf-8

Host: your-namespace.servicebus.windows.net

Content-Length: 42

Expect: 100-continue

 

{ "DeviceId":"dev-01", "Temperature":"37.0" }

Your Event Hub can collect up to millions of messages per second like this, each storing whatever data schema you want within them, and the Event Hubs service will store them in-order for you to later read/consume.
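
If you prefer the strongly-typed .NET client library mentioned above over raw HTTP, the equivalent send takes only a few lines.  Here is a minimal sketch using the EventHubClient class from the Azure Service Bus .NET SDK (the connection string, hub name and payload below are placeholders):

using System;
using System.Text;
using Microsoft.ServiceBus.Messaging;  // Azure Service Bus .NET SDK

class SendEvent
{
    static void Main()
    {
        // Placeholder connection string and Event Hub name - substitute your own values.
        var connectionString = "Endpoint=sb://your-namespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=your-key";
        var client = EventHubClient.CreateFromConnectionString(connectionString, "your-event-hub");

        // Same JSON payload as the raw HTTP example above.
        var payload = "{ \"DeviceId\":\"dev-01\", \"Temperature\":\"37.0\" }";
        client.Send(new EventData(Encoding.UTF8.GetBytes(payload)));

        Console.WriteLine("Event sent.");
    }
}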

Downstream Event Processing

Once you collect events, you no doubt want to do something with them.  Event Hubs includes an intelligent processing agent that allows for automatic partition management and load distribution across readers.  You can implement any logic you want within readers, and the data sent to the readers is delivered in the order it was sent to the Event Hub.
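
To make that concrete, here is a minimal, hedged sketch of a custom reader built on the EventProcessorHost processing agent from the Event Hubs .NET SDK; the class name is a placeholder and error handling is omitted:

using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus.Messaging;

// Minimal event reader: EventProcessorHost assigns Event Hub partitions to instances of
// this class and delivers events in the order they were written to each partition.
class SimpleEventProcessor : IEventProcessor
{
    public Task OpenAsync(PartitionContext context)
    {
        Console.WriteLine("Opened partition " + context.Lease.PartitionId);
        return Task.FromResult<object>(null);
    }

    public async Task ProcessEventsAsync(PartitionContext context, IEnumerable<EventData> messages)
    {
        foreach (var eventData in messages)
        {
            var body = Encoding.UTF8.GetString(eventData.GetBytes());
            Console.WriteLine("Partition {0}: {1}", context.Lease.PartitionId, body);
        }
        await context.CheckpointAsync();  // record progress so a restarted reader resumes here
    }

    public Task CloseAsync(PartitionContext context, CloseReason reason)
    {
        return Task.FromResult<object>(null);
    }
}

You then register the processor against your Event Hub and an Azure Storage account (used for lease and checkpoint state), for example with new EventProcessorHost("host-1", "your-event-hub", EventHubConsumerGroup.DefaultGroupName, eventHubConnectionString, storageConnectionString) followed by RegisterEventProcessorAsync<SimpleEventProcessor>().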

In addition to supporting the ability for you to write custom Event Readers, we also provide two easy ways to work with pre-built stream processing systems: our new Azure Stream Analytics service and Apache Storm.  Our new Azure Stream Analytics service supports doing stream processing directly from Event Hubs, and Microsoft has created an Event Hubs Storm Spout for use with Apache Storm clusters.

There are many rich ways you can use Event Hubs to collect and then hand off events/data for processing.

Event Hubs provides a super flexible and cost-effective building block that you can use to collect and process any events or data you can stream to the cloud, with the scalability to meet any need.

Learning More about Event Hubs

For more information about Azure Event Hubs, please review the following resources:

Stream Analytics: Distributed Stream Processing Service for Azure

I’m excited to announce the preview of our new Azure Stream Analytics service – a fully managed, real-time, distributed stream computation service that provides low-latency, scalable processing of streaming data in the cloud with an enterprise-grade SLA. The new Azure Stream Analytics service easily scales from small projects with just a few KB/sec of throughput to a gigabyte/sec or more of streamed data messages/events.

Our Stream Analytics pricing model enables you to run low-throughput streaming workloads continuously at low cost, and to scale up only as your business needs increase. We do this while maintaining built-in guarantees of event delivery and state management for fast recovery, which enables mission-critical business continuity.

Dramatically Simpler Developer Experience for Stream Processing Data

Stream Analytics supports a SQL-like language that dramatically lowers the bar on the developer expertise required to create a scalable stream processing solution. A developer can simply write a few lines of SQL to do common operations including basic filtering, temporal analysis operations, joining multiple live streams of data with other static data sources, and detecting stream patterns (or lack thereof).

This dramatically reduces the complexity and time it takes to develop, maintain and apply time-sensitive computations on real-time streams of data. Most other streaming solutions available today require you to write complex custom code, but with Azure Stream Analytics you can write simple, declarative and familiar SQL.

Fully Managed Service that is Easy to Setup

With Stream Analytics you can dramatically accelerate how quickly you can derive valuable real time insights and analytics on data from devices, sensors, infrastructure, or applications. With a few clicks in the Azure Portal, you can create a streaming pipeline, configure its inputs and outputs, and provide SQL-like queries to describe the desired stream transformations/analysis you wish to do on the data. Once running, you are able to monitor the scale/speed of your overall streaming pipeline and make adjustments to achieve the desired throughput and latency.

You can create a new Stream Analytics Job in the Azure Portal, by choosing New->Data Services->Stream Analytics:


Setup Streaming Data Input

Once created, your first step will be to add a Streaming Data Input.  This allows you to indicate where the data you want to perform stream processing on is coming from.  From within the portal you can choose Inputs->Add An Input to launch a wizard that enables you to specify this:


We can use the Azure Event Hub service to deliver us a stream of data to perform processing on. If you already have an Event Hub created, you can choose it from a list populated in the wizard above.  You will also be asked to specify the format that is being used to serialize incoming events in the Event Hub (e.g. JSON, CSV or Avro formats).

Setup Output Location

The next step to developing our Stream Analytics job is to add a Streaming Output Location.  This will configure where we want the output results of our stream processing pipeline to go.  We can choose to easily output the results to Blob Storage, another Event Hub, or a SQL Database:


Note that being able to use another Event Hub as a target provides a powerful way to connect multiple streams into an overall pipeline with multiple steps.

Write Streaming Queries

Now that we have our input and output sources configured, we can write SQL queries to transform, aggregate and/or correlate the incoming input (or set of inputs in the event of multiple input sources) and output them to our output target.  We can do this within the portal by selecting the QUERY tab at the top.


There are a number of interesting queries you can write to process the incoming stream of data.  For example, in the previous Event Hub section of this blog post I showed how you can use an HTTP POST command to submit JSON-based temperature data from an IoT device to an Event Hub, with data in JSON format like so:

{ "DeviceId":"dev-01", "Temperature":"37.0" }

When multiple devices are streaming events simultaneously into our Event Hub like this, they feed into our Stream Analytics job as a continuous stream of data events shaped like the JSON message above.

Wouldn’t it be interesting to be able to analyze this data using a time-window perspective instead?  For example, it would be useful to calculate, in real time, the average temperature of each device over the last 5 seconds of readings.

With the Stream Analytics Service we can now dynamically calculate this over our incoming live stream of data just by writing a SQL query like so:

SELECT DateAdd(second,-5,System.TimeStamp) as WinStartTime, system.TimeStamp as WinEndTime, DeviceId, Avg(Temperature) as AvgTemperature, Count(*) as EventCount 
    FROM input
    GROUP BY TumblingWindow(second, 5), DeviceId

Running this query in our Stream Analytics job will aggregate/transform our incoming stream of data events and write the results to the output source we configured for our job (e.g. a Blob Storage file or a SQL Database).

The great thing about this approach is that the data is being aggregated/transformed in real time as events are streamed to us, and it scales to handle literally gigabytes of event data streamed per second.

Scaling your Stream Analytics Job

Once defined, you can easily monitor the activity of your Stream Analytics Jobs in the Azure Portal:


You can use the SCALE tab to dynamically increase or decrease scale capacity for your stream processing – allowing you to pay only for the compute capacity you need, and enabling you to handle jobs with gigabytes/sec of streamed data. 

Learning More about Stream Analytics Service

For more information about Stream Analytics, please review the following resources:

Data Factory: Fully managed service to build and manage information production pipelines

Organizations are increasingly looking to fully leverage all of the data available to their business.  As they do so, the data processing landscape is becoming more diverse than ever before – data is being processed across geographic locations, on-premises and cloud, across a wide variety of data types and sources (SQL, NoSQL, Hadoop, etc), and the volume of data needing to be processed is increasing exponentially. Developers today are often left writing large amounts of custom logic to deliver an information production system that can manage and co-ordinate all of this data and processing work.

To help make this process simpler, I’m excited to announce the preview of our new Azure Data Factory service – a fully managed service that makes it easy to compose data storage, processing, and data movement services into streamlined, scalable & reliable data production pipelines. Once a pipeline is deployed, Data Factory enables easy monitoring and management of it, greatly reducing operational costs. 

Easy to Get Started

The Azure Data Factory is a fully managed service. Getting started with Data Factory is simple. With a few clicks in the Azure preview portal, or via our command line operations, a developer can create a new data factory and link it to data and processing resources.  From the new Azure Marketplace in the Azure Preview Portal, choose Data + Analytics –> Data Factory to create a new instance in Azure:


Orchestrating Information Production Pipelines across multiple data sources

Data Factory makes it easy to coordinate and manage data sources from a variety of locations – including ones both in the cloud and on-premises.  Support for working with data on-premises inside SQL Server, as well as Azure Blob, Tables, HDInsight Hadoop systems and SQL Databases is included in this week’s preview release. 

Access to on-premises data is supported through a data management gateway that allows for easy configuration and management of secure connections to your on-premises SQL Servers.  Data Factory balances the scale & agility provided by the cloud, Hadoop and non-relational platforms, with the management & monitoring that enterprise systems require to enable information production in a hybrid environment.

Custom Data Processing Activities using Hive, Pig and C#

This week’s preview enables data processing using Hive, Pig and custom C# code activities.  Data Factory activities can be used to clean data, anonymize/mask critical data fields, and transform the data in a wide variety of complex ways.
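
As a rough illustration only (the namespace and interface names below are assumptions based on the later Data Factory .NET SDK, where custom activities implement an IDotNetActivity interface, and they may differ in this preview), a custom C# activity has roughly this shape:

using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactories.Models;   // assumed SDK namespaces
using Microsoft.Azure.Management.DataFactories.Runtime;

// Hypothetical custom activity that masks sensitive fields in an input data slice.
public class ScrubPiiActivity : IDotNetActivity
{
    public IDictionary<string, string> Execute(
        IEnumerable<LinkedService> linkedServices,
        IEnumerable<Dataset> datasets,
        Activity activity,
        IActivityLogger logger)
    {
        logger.Write("Reading the input slice, masking PII fields, writing the output slice...");
        // ...transformation logic for this pipeline slice goes here...
        return new Dictionary<string, string>();  // properties handed to downstream activities
    }
}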

The Hive and Pig activities can be run on an HDInsight cluster you create, or alternatively you can allow Data Factory to fully manage the Hadoop cluster lifecycle on your behalf.  Simply author your activities, combine them into a pipeline, set an execution schedule and you’re done – no manual Hadoop cluster setup or management required. 

Built-in Information Production Monitoring and Dashboarding

Data Factory also offers an up-to-the-moment monitoring dashboard, which means you can deploy your data pipelines and immediately begin to view them as part of your monitoring dashboard.  Once you have created and deployed pipelines to your Data Factory you can quickly assess end-to-end data pipeline health, pinpoint issues, and take corrective action as needed.

Within the Azure Preview Portal, you get a visual layout of all of your pipelines and data inputs and outputs. You can see all the relationships and dependencies of your data pipelines across all of your sources so you always know where data is coming from and where it is going at a glance. We also provide you with a historical accounting of job execution, data production status, and system health in a single monitoring dashboard:


Learning More about Data Factory

For more information about Data Factory, please review the following resources:

Other Great Data Improvements

Today’s releases make it even easier for customers to stream, process and manage the movement of data in the cloud.  Over the last few months we’ve released a bunch of other great data updates as well that make Azure a great platform for any data need.  Since August:

We released a major update of our SQL Database service, which is a relational database as a service offering.  The new SQL DB editions (Basic/Standard/Premium) support a 99.99% SLA, larger database sizes, dedicated performance guarantees, point-in-time recovery, new auditing features, and the ability to easily set up active geo-DR support.

We released a preview of our new DocumentDB service, which is a fully-managed, highly-scalable, NoSQL Document Database service that supports saving and querying JSON-based data.  It enables you to linearly scale your document store to any application size.  The Microsoft MSN portal was recently rewritten to use it – and stores more than 20TB of data within it.

We released our new Redis Cache service, which is a secure, dedicated Redis cache offering, managed as a service by Microsoft.  Redis is a popular open-source solution that enables high-performance data types, and our Redis Cache service enables you to stand up an in-memory cache that can make the performance of any application much faster.
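
As a quick aside (this sketch is not part of the announcement itself), reading and writing the cache from .NET takes only a few lines with the open-source StackExchange.Redis client; the endpoint and password below are placeholders:

using System;
using StackExchange.Redis;  // open-source Redis client for .NET

class CacheDemo
{
    static void Main()
    {
        // Placeholder endpoint and access key for an Azure Redis Cache instance.
        var redis = ConnectionMultiplexer.Connect(
            "your-cache.redis.cache.windows.net,ssl=true,password=your-key");
        IDatabase db = redis.GetDatabase();

        db.StringSet("greeting", "Hello from the cache");   // write
        string value = db.StringGet("greeting");            // read
        Console.WriteLine(value);
    }
}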

We released major updates to our HDInsight Hadoop service, which is a 100% Apache Hadoop-based service in the cloud. We have also added built-in support for using two popular frameworks in the Hadoop ecosystem: Apache HBase and Apache Storm.

We released a preview of our new Search-As-A-Service offering, which provides managed search based on ElasticSearch that you can easily integrate into any Web or Mobile Application.  It enables you to build search experiences over any data your application uses (including data in SQLDB, DocDB, Hadoop and more).

And we have released a preview of our Machine Learning service, which provides a powerful cloud-based predictive analytics service.  It is designed for both new and experienced data scientists, includes 100s of algorithms from both the open source world and Microsoft Research, and supports writing ML solutions using the popular R open-source language.

You’ll continue to see major data improvements in the months ahead – we have an exciting roadmap planned.

Summary

Today’s Microsoft Azure release enables some great new data scenarios, and makes building applications that work with data in the cloud even easier.

If you don’t already have an Azure account, you can sign up for a free trial and start using all of the above features today.  Then visit the Microsoft Azure Developer Center to learn more about how to build apps with it.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

Categories: Blogs

MVP Showcase at the Microsoft MVP Summit 2014 (Think "Wow, I wanna talk to... and... and... and... and..." )

The Microsoft MVP Award Program Blog - MVP Showcase at the MVP Summit

What has quickly become a favorite among attendees at the MVP Global Summit is gearing up for its third installment.  The MVP Showcase, happening this Sunday from 16:00-19:00 at the Hyatt in Bellevue, gives MVPs the opportunity to share their current projects, network and discuss new technological trends.

"The MVP Showcase adds an additional layer of networking, collaboration and fun to the MVP Summit," says Community Program Manager and MVP Showcase organizer, Kari Finn.

MVP Presenters (in alphabetical order):



THIS is one of the best things about being an MS MVP, being able to chat and connect with all these MVPs (among so many of the other MVPs.... it's actually pretty darn humbling) and the connection to the Product Teams/Groups. I mean, it's like... wow...

 

Related Past Post XRef:
Guess who's a newly minted Microsoft MVP?
Microsoft MVP Award Unboxing

[Bucket List item checked off] I'm the DZone Featured MVP of the week! Want to be a Microsoft MVP? Here's a couple what, where and how's...
Aidan's "Beginners Guide To The MVP Summit"
Hey Microsoft MVP’s, pluralsight has got a great deal for you! A complimentary 1-Year Standard subscription to the entire Pluralsight On-Demand training library
Categories: Blogs

A glimpse at how Infragistics uses a C# to JavaScript transcompiler, powered by "Roslyn" (.NET Compiler Platform)

Infragistics - Mike Dour's Blog - Client-Side Excel Library CTP

If you haven’t seen it already, we recently released a 100% JavaScript-only, client-side Excel library for Ignite UI and I’m super excited about it. It allows you to read, write, and manipulate Excel workbooks. You can even create and solve formulas, all from inside the browser!! It was released in 14.2 as a CTP so we could get your feedback on it, but we will be releasing a complete RTM version in 15.1. You can find information and a live sample of it here. Definitely check out the overview page, which is packed with important information for using this library.

But that’s not even the coolest part. Not only did we deliver a purely JavaScript library for Excel workbooks, but it has all the features of our existing .NET Excel libraries. Did we re-write the entire C# Excel library in JavaScript to provide this level of feature parity? We could have, but it would have taken a lot of effort getting there not to mention the ongoing challenge of maintaining feature parity between the versions and addressing bugs in both implementations. So we came up with something better. We built a C# to JavaScript source-to-source compiler, or transcompiler. We have actually been using this for a few releases now to deliver some of the Ignite UI controls, but it was missing support for some constructs being used in the C# Excel library. So we really beefed up its language support as well as changed its semantic analysis engine. Now based on Microsoft’s .NET Compiler Platform ("Roslyn") for C# semantic analysis, our transcompiler is able to read in our existing C# Excel library and generate semantically equivalent JavaScript code. There are still a few rough edges to smooth out, but we are currently addressing these issues to deliver the highest quality Excel library we can in the next release.

Unfortunately, one of those rough edges was in documentation. ...

...

So hopefully this can help you get started with the Client-Side Excel library preview. There are a few things that don’t work properly yet (such as loading files with dates), but what we have provided should give you a good sense of what’s to come in 15.1. Please let us know what you think and if there are any pain points with the API or ways you think we can do better to make this library as easy as possible to use. Let us know at igniteui@infragistics.com. We look forward to your feedback. Thanks!

While you guys know I have something of a fanboy crush on Infragistics (come on, I've been using their stuff, in many forms since its VBX days... ;) that's not why I'm blogging about this. What I wanted to highlight is how they are using .NET Compiler Platform ("Roslyn") as their transcompiler to take the C# and generate JavaScript...

"...We built a C# to JavaScript source-to-source compiler, or transcompiler. We have actually been using this for a few releases now to deliver some of the Ignite UI controls, but it was missing support for some constructs being used in the C# Excel library. So we really beefed up its language support as well as changed its semantic analysis engine. Now based on Microsoft’s .NET Compiler Platform ("Roslyn") for C# semantic analysis, our transcompiler is able to read in our existing C# Excel library and generate semantically equivalent JavaScript code. ..."

That's just cool. And something I wonder if they will productize? (If so, that wouldn't be cheap as I bet that's some serious IP). Still the fact they even share that this is some of their secret sauce is nice (see, I'm not a fanboy for just any reason.... ;)
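
For anyone curious what using Roslyn for semantic analysis looks like in practice, here is a minimal, hypothetical sketch (not Infragistics' code) that parses a snippet of C# and asks the semantic model about a method, the kind of symbol and type information a C#-to-JavaScript transcompiler needs before it can emit equivalent JavaScript:

using System;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class RoslynPeek
{
    static void Main()
    {
        // A tiny piece of C# source a transcompiler might need to understand.
        var tree = CSharpSyntaxTree.ParseText("class C { int Twice(int x) { return x * 2; } }");

        var compilation = CSharpCompilation.Create("demo",
            syntaxTrees: new[] { tree },
            references: new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) });

        // The semantic model resolves the symbols and types behind the raw syntax tree.
        var model = compilation.GetSemanticModel(tree);
        var method = tree.GetRoot().DescendantNodes().OfType<MethodDeclarationSyntax>().First();
        var symbol = (IMethodSymbol)model.GetDeclaredSymbol(method);

        Console.WriteLine("{0} returns {1}", symbol.Name, symbol.ReturnType.ToDisplayString());
    }
}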

Categories: Blogs

The Brent Ozar New All-in-One Download Pack (and discount on their commercial stuff secret)

Brent Ozar - Announcing Our New All-in-One Download Pack

We give away a lot of stuff – scripts, setup checklists, e-books, posters, you name it.

But we kept hearing a theme from folks: “Wow, I’ve seen one of your tools before, but I had no idea there were so many others!” In order to get everything, they had to go all over the place in our site.

To fix that, we’ve got a new easy button: our free SQL Server download pack. Now when you get anything, you’ll get everything in a single zip file, plus get email notifications whenever there’s a new version.

Enjoy, and hope we make your job suck just a little less.


[GD: POST Leached in FULL... Come on, it was just a few sentences!]

All of the awesome free Brent Ozar and team downloads in one? Yeah, that's cool.

The discount secret? You have to subscribe to their RSS Feed to see... :)

Categories: Blogs

What if Windows 3.1 had a baby with Windows 95? Windows 93 (and you can play with it in your browser...)

OSNews - Try Windows 93 Today

What if Microsoft released an operating system in the chasm between Windows 3.1 and Windows 95? It might look something like Windows 93, an interactive art project by Jankenpopp and Zombectro that you can try right in your browser.

Try Windows 93: The Hilarious OS That Never Was

If you didn’t live through the jump, it can be hard to describe the cultural revolution between Windows 3.1 and Windows 95. Its taskbar ushered in an era of “multitasking”; its built-in web browser put the world’s information at your fingertips; its “Start” menu, complete with its own ~$10 million Rolling Stones song, was pure optimism rendered in bits.

But what if Microsoft released an operating system in the chasm between Windows 3.1 and Windows 95? It might look something like Windows 93, an interactive art project by Jankenpopp and Zombectro that you can try right in your browser.

The experience of the OS is hard to put into words--it’s Windows imagined in some parallel universe, with plenty of retro homages to the weird OS quirks of yore.

...

It’s surprising just how deep you can dig in Windows 93, thanks to content like GameBoy emulators and pixel editors that have actually been pulled from various sources across the web. I spent a shameful amount of time giggling nostalgically, until suddenly, a beach ball of death showed up on my screen. At first, I figured it was just another one of Windows 93’s jokes until, moments later, Chrome froze and then crashed.

...


http://www.windows93.net/


What is very ironic is that, for me at least, the site seems to work better in Chrome. :/

Categories: Blogs

TypeScript Demos from My TechEd Europe Session

Gil Fink on .Net - Thu, 10/30/2014 - 22:22

I had the pleasure of delivering a TypeScript session today at TechEd Europe.
In the session, I introduced the TypeScript language and talked about language features and how to use them.
Later on, I wrote a simple end-to-end web application using TypeScript in the server (with Node.js and Express) and in the front-end (mainly with jQuery).
I want to thank all the session attendees!

You can find the demos online.

Categories: Blogs

The Ins and Outs of Azure Stream Analytics – Real-Time Event Processing

Yesterday at TechEd Europe 2014, Microsoft announced the preview of Azure Stream Analytics. This post will give you the ins and outs of this new service.

What is Azure Stream Analytics?

Azure Stream Analytics is a cost effective event processing engine that helps uncover real-time insights from devices, sensors, infrastructure, applications, and data. Deployed in the Azure cloud, Stream Analytics has elastic scale where resources are efficiently allocated and paid for as requested. Developers are given a rapid development experience where they describe their desired transformations in SQL-like syntax. Some unique aspects about Stream Analytics are:

  • Low cost: Stream Analytics is architected for multi-tenancy, meaning you only pay for what you use and not for idle resources.  Unlike other solutions, small streaming jobs will be cost effective.
  • Faster developer productivity: Stream Analytics allows developers to use a SQL-like syntax that can speed up development time from thousands of lines of code down to a few lines.  The system abstracts the complexities of parallelization, distributed computing, and error handling away from the developers.
  • Elasticity of the cloud: Stream Analytics is built as a managed service in Azure.  This means customers can spin up or down any number of resources on demand.  Customers will not have to set up costly hardware or install and maintain software.

Similar to the recent announcement Microsoft made in making Apache Storm available in Azure HDInsight, Stream Analytics is a stream processing engine that is integrated with a scalable event queuing system like Azure Event Hubs. By making both Storm and Stream Analytics available, Microsoft is giving customers options to deploy their real-time event processing engine of choice.

What can it do?

Stream Analytics will enable various scenarios, including Internet of Things (IoT) scenarios such as real-time fleet management or gaining insights from devices like mobile phones and connected cars. Specific scenarios that customers are addressing with real-time event processing include:

  • Real-time ingestion, processing and archiving of data: Customers will use Stream Analytics to ingest a continuous stream of data and do in-flight processing like scrubbing PII information, adding geo-tagging, and doing IP lookups before the data is sent to a data store.
  • Real-time Analytics: Customers will use Stream Analytics to provide real-time dashboarding where they can see trends the moment they occur.
  • Connected devices (Internet of Things): Customers will use Stream Analytics to get real-time information from their connected devices like machines, buildings, or cars so that relevant action can be taken. This can include scheduling a repair technician, pushing down software updates, or performing a specific automated action.

How do I get started?

For Microsoft customers, we are offering Azure Stream Analytics as a public preview.  To get started, customers will need to have an Azure subscription or a free trial of Azure. With this in hand, you should be able to get Azure Stream Analytics up and running in minutes. Start by reading this getting started guide.

For more information on Azure Stream Analytics:

Categories: Companies

The Ins and Outs of Azure Data Factory – Orchestration and Management of Diverse Data

Yesterday at TechEd Europe 2014, Microsoft announced the preview of Azure Data Factory. This post will give you the ins and outs of this new service.

What is Azure Data Factory?

Azure Data Factory is a fully managed service that enables information production by orchestrating data across processing services as managed data pipelines. A pipeline connects diverse data (like SQL Server on-premises or cloud data like Azure SQL Database, Blobs, Tables, and SQL Server in Azure Virtual Machines) with diverse processing techniques (like Azure HDInsight (Hive and Pig), and custom C# activities).  This allows the data developer to transform and shape the data (join, aggregate, cleanse, enrich) so that it becomes authoritative and trustworthy enough to be consumed by BI tools. These pipelines are all managed within a single pane of glass where rich health and lineage information is available to diagnose issues or do impact analysis across all data and processing assets. Some unique points about Data Factory are:

  • Ability to process data from diverse locations and data types.  Data Factory can pull data from relational, on-premises sources like SQL Server and join it with non-relational, cloud sources like Azure Blobs.
  • A holistic view of the entire IT infrastructure, spanning both commercial and open source technologies. Data Factory can orchestrate Hive and Pig on Hadoop while also bringing in commercial products and services like SQL Server and Azure SQL Database in a single view.

What can it do?

With the ability to manage and orchestrate the collection, movement and transformation of semi-structured and structured data together, Data Factory provides customers with a central place to manage their processing of web log analytics, click stream analysis, social sentiment, sensor data analysis, geo-location analysis, and more. In public preview, Microsoft views Data Factory as a key tool for customers who are looking to have a hybrid story with SQL Server or who currently use Azure HDInsight, Azure SQL Database, Azure Blobs, and Power BI for Office 365. In the future, we’ll bring more data sources and processing capabilities to the Data Factory.

How do I get started?

For Microsoft customers, we are offering Azure Data Factory as a public preview.  To get started, customers will need to have an Azure subscription or a free trial of Azure. With this in hand, you should be able to get Azure Data Factory up and running in minutes. Start by reading this getting started guide.

For more information on Azure Data Factory:

Categories: Companies

Lazy Initialization in .NET

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article, we will learn the basics of Lazy Initialization in .NET and how to implement it in our applications.
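
As a quick primer before reading the article (this snippet is not from the article itself), the core of the pattern is the System.Lazy<T> type, which defers construction of an expensive object until its Value property is first accessed:

using System;

class ReportGenerator
{
    public ReportGenerator() { Console.WriteLine("Expensive constructor ran."); }
    public string Build() { return "report"; }
}

class Program
{
    static void Main()
    {
        // The factory delegate is not invoked here...
        var lazyReport = new Lazy<ReportGenerator>(() => new ReportGenerator());

        Console.WriteLine("Created: " + lazyReport.IsValueCreated);  // False
        Console.WriteLine(lazyReport.Value.Build());                 // ...only on first access to .Value
        Console.WriteLine("Created: " + lazyReport.IsValueCreated);  // True
    }
}
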
Categories: Communities

View to Controller Method 9: Day 31

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article we post data from a view to a controller using a form and get the data using Request.Params.
Categories: Communities

Windows Phone - XAML Styling

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article you will learn XAML styling in Windows Phone.
Categories: Communities

View to Controller Method 8: Day 30

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article we post data from a view to a controller using a form and get the data using Request.Form.Get.
Categories: Communities

Windows Phone - Command Bar

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
Today in this article I'll talk about a simple but useful topic, the “Windows Phone Command Bar”.
Categories: Communities

Remote Attribute Sample in MVC: Day 29

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article we will see how to apply a remote attribute in MVC.
Categories: Communities

Managed Metadata Service Fix in SharePoint

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In this article you will learn how to fix a problem with the Managed Metadata Service or Connection in SharePoint.
Categories: Communities

Mastering WPF DataGrid in a Day: Hour 6 DataGrid Columns In-Depth

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
In my last article, I demonstrated how to use some of the basic columns functionality. In this Hour 6 part of Mastering WPF DataGrid in a Day series, I will cover some of the advanced functionality of DataGrid columns.
Categories: Communities

C# Interactive Compiler Development

C-Sharpcorner - Latest Articles - Thu, 10/30/2014 - 08:00
This article explains the complete life cycle of making a custom interactive C# compiler much like the existing CSC.exe.
Categories: Communities