
Feed aggregator

SharePoint Add-ins (SharePoint Hosted Add-in) - Part Two

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
This is part two of a multi-part series. In this article, I'll cover a SharePoint-hosted add-in with an example.
Categories: Communities

Showing Message Dialogs When Using MVVM Pattern In Windows 10 UWP APP

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article we are going to see how to show a message dialog using the MVVM pattern in a Windows 10 Universal app.
Categories: Communities

Create Virtual Machine Using Azure's New Portal

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article we will see how to create a Virtual Machine (VM) using Azure's new portal.
Categories: Communities

Splunk Enterprise: The Platform for Operational Intelligence

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn about Splunk Enterprise, the platform for Operational Intelligence.
Categories: Communities

Deploying NodeJS On Heroku

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn how to deploy a NodeJS application on Heroku.
Categories: Communities

Voice of a Developer: Browser Runtime - Part Thirty Three

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn about Browser Runtime in JavaScript. This is part 33 of the article series.
Categories: Communities

Set Up Server Environment For ASP.NET MVC Application On Development Machine

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn how to set up a server environment for an ASP.NET MVC application on a development machine.
Categories: Communities

Working With SharePoint Site Collection Features Using REST API

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn how to work with site collection features (site-scoped) programmatically using the REST API.
Categories: Communities

What Makes iOS A Developer Friendly Platform

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will see what makes iOS a developer-friendly platform.
Categories: Communities

Client Side Exporting In HighChart

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article we will discuss how to enable client-side exporting in HighChart.
Categories: Communities

Top 10 Most Important Features Of C# Programming

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article, we'll learn about the most important features of C#.
Categories: Communities

Working With Test Client In ASP.NET Web API Help Page

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article we are going to see how to test our API with the help of a package called WebApiTestClient.
Categories: Communities

Active Directory Domain Services Installation On Windows Server 2016 Technical Preview 4

C-Sharpcorner - Latest Articles - 14 hours 59 min ago
In this article you will learn about Active Directory Domain Services Installation on Windows Server 2016 Technical Preview 4.
Categories: Communities

Writing cleaner JavaScript code with gulp and eslint

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

With the new ASP.NET Core 1.0 RC2 right around the corner and its deep integration with the Node.js workflow, I thought I'd put out some examples of what I use in my own workflow.

In this scenario, we're going to see how we can improve the JavaScript code that we are writing.

Gulp

This example uses gulp.

I'm not saying that gulp is the best tool for the job. I just find that gulp works really well for our team, and you should seriously consider it.

Base file

Let's get things started. We'll start off with the base gulpfile that ships with the RC1 template.

The first thing we are going to do is check what is being done and what is missing.

/// <binding Clean='clean' />
"use strict";

var gulp = require("gulp"),
    rimraf = require("rimraf"),
    concat = require("gulp-concat"),
    cssmin = require("gulp-cssmin"),
    uglify = require("gulp-uglify");

var paths = {
    webroot: "./wwwroot/"
};

paths.js = paths.webroot + "js/**/*.js";
paths.minJs = paths.webroot + "js/**/*.min.js";
paths.css = paths.webroot + "css/**/*.css";
paths.minCss = paths.webroot + "css/**/*.min.css";
paths.concatJsDest = paths.webroot + "js/site.min.js";
paths.concatCssDest = paths.webroot + "css/site.min.css";

gulp.task("clean:js", function (cb) {
    rimraf(paths.concatJsDest, cb);
});

gulp.task("clean:css", function (cb) {
    rimraf(paths.concatCssDest, cb);
});

gulp.task("clean", ["clean:js", "clean:css"]);

gulp.task("min:js", function () {
    return gulp.src([paths.js, "!" + paths.minJs], { base: "." })
        .pipe(concat(paths.concatJsDest))
        .pipe(uglify())
        .pipe(gulp.dest("."));
});

gulp.task("min:css", function () {
    return gulp.src([paths.css, "!" + paths.minCss])
        .pipe(concat(paths.concatCssDest))
        .pipe(cssmin())
        .pipe(gulp.dest("."));
});

gulp.task("min", ["min:js", "min:css"]);

As you can see, we basically have 4 tasks and 2 aggregate tasks.

  • Clean JavaScript files
  • Clean CSS files
  • Minify JavaScript files
  • Minify CSS files

The aggregate tasks simply run all the cleaning or all the minifying at once.

Getting more out of it

Well, that brings us to feature parity with what was available in MVC 5 for JavaScript and CSS minification. However, why not go a step further?

Linting our JavaScript

One of the most common things we need to do is make sure we do not write horrible code. Linting is a static code analysis technique that detects problems or stylistic issues early.

How do we get this working with gulp?

First, we install gulp-eslint by running npm install gulp-eslint --save-dev in the web application project folder. This installs the required dependencies, and we can start writing some code.

Let's start by requiring the dependency:

var eslint = require('gulp-eslint');

Then, in your default ASP.NET Core 1.0 project, open up site.js and paste in the following code:

function something() {
}

var test = new something();

Let's run the min:js task with gulp like this: gulp min:js. This will show that our file is minified but... there's something wrong with the style of this code. The something function is used as a constructor, so it should be Pascal cased, and we want that reflected in our code.

Let's integrate the linter in our pipeline.

First let's create our linting task:

gulp.task("lint", function() {
    return gulp.src([paths.js, "!" + paths.minJs], { base: "." })
        .pipe(eslint({
            rules : {
                'new-cap': 1 // functions need to begin with a capital letter when newed up
            }
        }))
        .pipe(eslint.format())
        .pipe(eslint.failAfterError());
});

Then, we need to integrate it into our minify task.

gulp.task("min:js" , ["lint"], function () { ... });
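For reference, here is a sketch of what the fully integrated min:js task looks like once the lint dependency is added (same body as before, just with ["lint"] declared as a prerequisite):

gulp.task("min:js", ["lint"], function () {
    // lint runs first; if it fails, minification never happens
    return gulp.src([paths.js, "!" + paths.minJs], { base: "." })
        .pipe(concat(paths.concatJsDest))
        .pipe(uglify())
        .pipe(gulp.dest("."));
});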

Then we can either run gulp lint or gulp min and see the result.

C:\_Prototypes\WebApplication1\src\WebApplication1\wwwroot\js\site.js
  6:16  warning  A constructor name should not start with a lowercase letter  new-cap

And that's it! You can build your own configuration from the available rules and have clean JavaScript as part of your build flow!

Many more plugins available

More gulp plugins are available on the registry. Whether you want to lint, transpile to JavaScript (TypeScript, CoffeeScript), compile CSS (Less, SASS), minify images... everything can be included in the pipeline.
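For example, compiling Less and re-linting on change could look like this (a sketch; gulp-less is just one of many options, and the globs below are assumptions based on the paths defined earlier):

var less = require("gulp-less");

gulp.task("less", function () {
    // compile every .less file under wwwroot/css into plain CSS
    return gulp.src(paths.webroot + "css/**/*.less")
        .pipe(less())
        .pipe(gulp.dest(paths.webroot + "css"));
});

gulp.task("watch", function () {
    // re-run the linter whenever a JavaScript file changes
    gulp.watch([paths.js, "!" + paths.minJs], ["lint"]);
});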

Look up the registry and start hacking away!

Categories: Blogs

Creating a simple ASP.NET 5 Markdown TagHelper

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

I've been dabbling a bit with the new ASP.NET 5 TagHelpers and I was wondering how easy it would be to create one.

I've created a simple Markdown TagHelper with the CommonMark implementation.

So let me show you what it is, what each line of code is doing and how to implement it in an ASP.NET MVC 6 application.

The Code
using CommonMark;
using Microsoft.AspNet.Mvc.Rendering;
using Microsoft.AspNet.Razor.Runtime.TagHelpers;

namespace My.TagHelpers
{
    [HtmlTargetElement("markdown")]
    public class MarkdownTagHelper : TagHelper
    {
        public ModelExpression Content { get; set; }
        public override void Process(TagHelperContext context, TagHelperOutput output)
        {
            output.TagMode = TagMode.SelfClosing;
            output.TagName = null;

            var markdown = Content.Model.ToString();
            var html = CommonMarkConverter.Convert(markdown);
            output.Content.SetContentEncoded(html);
        }
    }
}
Inspecting the code

Let's start with the HtmlTargetElementAttribute. This will wire the HTML tag <markdown></markdown> to be interpreted and processed by this class. There is nothing stopping you from having more than one target.

You could for example target element <md></md> by just adding [HtmlTargetElement("md")] and it would support both tags without any other changes.

The Content property will allow you to write code like this:

@model MyClass

<markdown content="@ViewData["markdown"]"></markdown>    
<markdown content="Markdown"></markdown>    

This easily allows you to use your model or any server-side code without having to handle data mapping manually.

TagMode.SelfClosing will force the HTML to use a self-closing tag rather than having content inside (which we're not going to use anyway). So now we have this:

<markdown content="Markdown" />

All the remaining lines of code are dedicated to making sure that the content we render is actual HTML. Setting output.TagName to null just makes sure that we do not render the markdown tag itself.

And... that's it. Our code is complete.

Activating it

Now, you can't just go and create TagHelpers and have them picked up automatically without wiring up one thing.

In your ASP.NET 5 projects, go to /Views/_ViewImports.cshtml.

You should see something like this:

@addTagHelper "*, Microsoft.AspNet.Mvc.TagHelpers"

This will load all TagHelpers from the Microsoft.AspNet.Mvc.TagHelpers assembly.

Just duplicate the line and type in your own assembly name (for the class above, something like @addTagHelper "*, My.TagHelpers", assuming the assembly is named after its namespace).

Then in your Razor code you can have the code below (the MyClass model shown first for reference, followed by the view):

public class MyClass
{
    public string Markdown { get; set; }
}
@model MyClass
@{
    ViewData["Title"] = "About";
}
<h2>@ViewData["Title"].</h2>  

<markdown content="Markdown"/>

Which will output your markdown formatted as HTML.

Now, whether you load your markdown from files, a database, or anywhere else... you can let your users write rich text in any text box and have your application generate safe HTML.

Components used

  • CommonMark.NET (provides the CommonMarkConverter used above)
Categories: Blogs

Should our front-end websites be server-side at all?

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

I've been toying around with projects like Jekyll, Hexo, and even some hand-rolled software that generates HTML files for me based on data. The thought that crossed my mind was…

Why do we need dynamically generated HTML again?

Let me take examples and build my case.

Example 1: Blog

Of course, the simpler examples like blogs could literally all be static. If you need comments, you could go with a system like Disqus. That is quite literally one of the only parts of your system that needs to be dynamic.

RSS feed? Generated from posts. Posts themselves? Could be automatically generated from a database or Markdown files periodically. The resulting output can be hosted on a Raspberry Pi without any issues.
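As a minimal sketch of that idea (assuming the commonmark npm package and made-up file names), turning a Markdown post into a static HTML page takes only a few lines of Node:

var fs = require("fs");
var commonmark = require("commonmark");

var reader = new commonmark.Parser();
var writer = new commonmark.HtmlRenderer();

// read the Markdown source, convert it to HTML, and write out a static page
var markdown = fs.readFileSync("posts/hello-world.md", "utf8");
var html = writer.render(reader.parse(markdown));
fs.writeFileSync("output/hello-world.html", html);

Run that on a schedule (or on every new post) and the "dynamic" part of the blog disappears entirely.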

Example 2: E-Commerce

This one is more of a problem. Here are the things that don’t change a lot. Products. OK, they may change but do you need to have your site updated right this second? Can it wait a minute? Then all the “product pages” could literally be static pages.

Product reviews? They will need to be “approved” anyway before you want them live. Put them in a server-side queue, and regenerate the product page with the updated review once it's done.

There are three things I can see that would need to be dynamic in this scenario.

Search, Checkout, and Reviews. Search, because as your product catalog scales up, so does your data; doing the search client-side won't scale at any level. Checkout, because we are now handling an actual order and it needs a server component. Reviews, because we'll need to approve and publish them.

In this scenario, only the Search is an actual “read” component that stays server-side. Everything else? Pre-generated. Even if the search brings you the list of products dynamically, each result can still point to a static page.
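A minimal sketch of that split (hypothetical Express endpoint and in-memory index, purely for illustration) could look like this:

var express = require("express");
var app = express();

// hypothetical in-memory index; in practice this would be a real search backend
var products = [
    { id: "red-widget", name: "Red Widget" },
    { id: "blue-widget", name: "Blue Widget" }
];

app.get("/api/search", function (req, res) {
    var q = (req.query.q || "").toLowerCase();
    // the only dynamic "read": each hit points back to a pre-generated static page
    var hits = products
        .filter(function (p) { return p.name.toLowerCase().indexOf(q) !== -1; })
        .map(function (p) { return { name: p.name, url: "/products/" + p.id + ".html" }; });
    res.json(hits);
});

app.listen(3000);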

All the other write components? Queued server side to be processed by the business itself with either Azure or an off-site component.

The whole backend side of the business (managing products, availability, sales, and so on) will need a management UI that is 100% dynamic (read/write).

Question

So… do we need a dynamic front-end built with the latest server framework? On the public-facing side too, or just the backend?

If you want to discuss it, Tweet me at @MaximRouiller.

Categories: Blogs

You should not be using WebComponents yet

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

Have you read about WebComponents? It sounds like something that we have all been trying to achieve on the web for... well... a long time.

If you take a look at the specification, it's hosted on the W3C website. It smells like a real specification. It looks like a real specification.

The only issue is that Web Components is really four specifications. Let's take a look at all four of them.

Reviewing the specifications

HTML Templates

Specification

This particular specification is not part of the "Web components" section; it has been integrated into HTML5. Hence, this one is safe.

Custom Elements

Specification

This specification is for review and not for implementation!

Alright, no. Let's not touch this yet.

Shadow DOM

Specification

This specification is for review and not for implementation!

Wow. Okay, so this one is out the window too.

HTML Imports

Specification

This one is still a working draft so it hasn't been retired or anything yet. Sounds good!

Getting into more details

So open all of those specifications. Go ahead. I want you to read one section in particular: the authors/editors section. What do we learn? That those specs were drafted, edited, and all done by the Google Chrome team. The exception is maybe HTML Templates, which has Tony Ross (previously a PM on the Internet Explorer team).

What about browser support?

Chrome already has all the specs implemented.

Firefox has implemented them but put them behind a flag (in about:config, search for the property dom.webcomponents.enabled).

In Internet Explorer, they are all listed as Under Consideration.

What that tells us

Google is pushing for a standard. Hard. They built the specs and are pushing them very hard, since all of this is available in Chrome STABLE right now. No other vendor has contributed to the specs themselves. Polymer is also a project built around WebComponents, and it's built by... well, the Chrome team.

That tells me that nobody right now should be implementing this in production. If you want to contribute to the spec, fine. But WebComponents are not to be used.

Otherwise, we're just getting into the same situation we were in 10-20 years ago with Internet Explorer, and we know that's a painful path.

What is wrong right now with WebComponents

First, it's not cross-platform. We've handled that in the past; that's not something that would stop us.

Second, the current specification is being implemented in Chrome as if it were recommended by the W3C (it is not), which may lead to changes in the specification that render your current implementation completely inoperable.

Third, there's no guarantee that the current spec will even be accepted by the other browsers. If we get there and Chrome doesn't move, we're back to the Internet Explorer 6 era, but this time with Chrome.

What should I do?

As far as "production" is concerned, do not use WebComponents directly. Also avoid Polymer, as it's only a simple wrapper around WebComponents (even with the polyfills).

Use other frameworks that abstract away the WebComponents part, frameworks like X-Tag or Brick. That way you can benefit from the features without learning a specification that may become obsolete very quickly or never be implemented at all.

Categories: Blogs

Fix: Error occurred during a cryptographic operation.

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

Have you ever had this error while switching between projects using the Identity authentication?

Are you still wondering what it is and why it happens?

Clear your cookies. The FedAuth cookie is encrypted using the machine key defined in your web.config. If there is none defined in your web.config, it will use a common one. And if the key used to encrypt isn't the same as the one used to decrypt?

Boom goes the dynamite.

Categories: Blogs

Renewed MVP ASP.NET/IIS 2015

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

Well there it goes again. It was just confirmed that I am renewed as an MVP for the next 12 months.

Becoming an MVP is not an easy task. Offline conferences, blogs, Twitter, helping manage a user group. All of this is done in my free time, and it requires a lot of it. But I'm so glad to be part of the big MVP family once again!

Thanks to all of you who interacted with me last year, let's do it again this year!

Categories: Blogs

Failed to delete web hosting plan Default: Server farm 'Default' cannot be deleted because it has sites assigned to it

Decaying Code - Maxime Rouiller - Sat, 05/28/2016 - 01:53

So I had this issue where I was moving web apps between hosting plans. Once they were all transferred, I wondered why it refused to delete the old plan, giving me this error message.

After a few clicks left and right and a lot of wasted time, I found this blog post that provides a script to help you debug, along with the exact explanation of why it doesn't work.

To make things quick, it's all about "Deployment Slots". Among other things, they have their own serverFarm setting, and it does not change when you change their parent in PowerShell (I haven't tried via the portal).

Here's a copy of the script from Harikharan Krishnaraju for future reference:

Switch-AzureMode AzureResourceManager
$Resource = Get-AzureResource

foreach ($item in $Resource)
{
	if ($item.ResourceType -Match "Microsoft.Web/sites/slots")
	{
		$plan=(Get-AzureResource -Name $item.Name -ResourceGroupName $item.ResourceGroupName -ResourceType $item.ResourceType -ParentResource $item.ParentResource -ApiVersion 2014-04-01).Properties.webHostingPlan;
		write-host "WebHostingPlan " $plan " under site " $item.ParentResource " for deployment slot " $item.Name ;
	}

	elseif ($item.ResourceType -Match "Microsoft.Web/sites")
	{
		$plan=(Get-AzureResource -Name $item.Name -ResourceGroupName $item.ResourceGroupName -ResourceType $item.ResourceType -ApiVersion 2014-04-01).Properties.webHostingPlan;
		write-host "WebHostingPlan " $plan " under site " $item.Name ;
	}
}
      
    
Categories: Blogs