The annoying thing about software in production is that it is a black box. It just sits there, doing something, and you have very little insight into what. You can look at CPU usage and memory consumption, and you can try to infer what is going on from what the operating system tells you the process is doing. But for the most part, this is a black box, and not even one that is designed to let you figure out what just happened.
With RavenDB, we have made a very conscious effort to avoid being a black box. There are a lot of endpoints that you can query to figure out exactly what is going on, and different endpoints help diagnose different problems. But in the end, while those were very easy for us to use, they aren’t really meant for end users. They are meant mostly for our support engineers.
We got tired of the whole “give me the output of the following endpoints” routine. We wanted a better story, something that would be easier and more convenient all around. So we sat down, thought about it, and came up with the idea of the Debug Info Package.
This deceptively simple tool will capture all of the relevant information from RavenDB into a single zip file that you can mail to support. It also gives you a lot of detail about the internals of RavenDB at the moment the package was produced:
- Recent HTTP requests
- Recent logs
- The database configuration
- What is currently being indexed?
- What are the current queries?
- What tasks are being run?
- All the database metrics
- Current status of the pre-fetch queue
- The database live stats
And if that wasn’t enough, we have the following feature as well:
We get the full stack of the currently running process!
You can see what this looks like in full here:
But the idea is that we have cracked open the black box, and it is now so much easier to figure out what is going on!
PASS VP of Marketing Denise McInerney – a SQL Server MVP and Data Engineer at Intuit – began her career as a SQL Server DBA in 1998 and attended her first PASS Summit in 2002. The SQL Server Team caught up with her ahead of this year’s event, returning to Seattle, WA, Nov. 4-7, to see what she’s looking forward to at the world’s largest conference for SQL Server and BI professionals.
For those who’ve never attended or who’ve been away for a while, what is PASS Summit?
PASS Summit is the world’s largest gathering of Microsoft SQL Server and BI professionals. Organized by and for the community, PASS Summit delivers the most technical sessions, the largest number of attendees, the best networking, and the highest-rated sessions and speakers of any SQL Server event.
We like to think of PASS Summit as the annual reunion for the #sqlfamily. With over 200 technical sessions and 70+ hours of networking opportunities with MVPs, experts and peers, it’s 3 focused days of SQL Server. You can take hands-on workshops, attend Chalk Talks with the experts, and get the answers you need right away at the SQL Server Clinic, staffed by the Microsoft CSS and SQLCAT experts who build and support the features you use every day. Plus, you can join us early for 2 days of pre-conference sessions with top industry experts and explore the whole range of SQL Server solutions and services under one roof in the PASS Summit Exhibit Hall.
Nowhere else will you find over 5,000 passionate SQL Server and BI professionals from 50+ countries and 2,000 different companies connecting, sharing, and learning how to take their SQL Server skills to the next level.
What’s on tap this year as far as sessions?
We’ve announced a record 160+ incredible community sessions across 5 topic tracks: Application and Database Development; BI Information Delivery; BI Platform Architecture, Development, and Administration; Enterprise Database Administration and Deployment; and Professional Development. And watch for over 60 sessions from Microsoft’s top experts to be added to the lineup in early September.
You can search by speaker, track, session skill level, or session type – from 10-minute Lightning Talks, to 75-minute General Sessions, to 3-hour Half-Day Sessions and our full-day pre-conference workshops.
And with this year’s new Learning Paths, we’ve made it even easier to find the sessions you’re most interested in. Just use our 9 Learning Path filters to slice and dice the lineup by everything from Beginner sessions to Big Data, Cloud, Hardware Virtualization, and Power BI sessions to SQL Server 2014, High Availability/Disaster Recovery, Performance, and Security sessions.
Networking is at the heart of PASS Summit – what opportunities do you have for attendees to connect with each other?
PASS Summit is all about meeting and talking with people, sharing issues and solutions, and gaining knowledge that will make you a better SQL Server professional. Breakfasts, lunches, and evening receptions are all included and are designed to offer dedicated networking opportunities. And don't underestimate the value of hallway chats and the ability to talk to speakers after their sessions, during lunches and breaks, and at the networking events.
We have special networking activities for first-time attendees, for people interested in the same technical topics at our Birds of a Feather luncheon, and at our popular annual Women in Technology luncheon, which connects 600+ attendees interested in advancing the role of women in STEM fields. Plus, our Community Zone is THE place to hang out with fellow attendees and community leaders and learn how to stay involved year-round.
You mentioned the networking events for first-time attendees. With everything going on at Summit, how can new attendees get the most out of their experience?
Our First-Timers Program takes the hard work out of conference prep and is designed specifically to help new attendees make the most of their time at Summit. We connect first-timers with conference alumni, take them inside the week with community webinars, help them sharpen their networking skills through fun onsite workshops, and share inside advice during our First Timers orientation meeting.
In addition, in our “Get to Know Your Community Sessions,” longtime PASS members share how to get involved with PASS and the worldwide #sqlfamily, including encouraging those new to PASS to connect with their local SQL Server communities through PASS Chapters and continue their learning through Virtual Chapters, SQLSaturdays, and other free channels.
How can you learn more about sessions and the overall PASS Summit experience?
A great way to get a taste of Summit is by watching PASS Summit 2013 sessions, interviews, and more on PASStv. You can also check out the best of last year’s Community blogs.
Plus, stay tuned for 24 Hours of PASS: Summit Preview Edition on September 9 to get a free sneak peek at some of the top sessions and speakers coming to PASS Summit this year. Make sure you follow us on Twitter at @PASS24HOP / #pass24hop for the latest updates on these 24 back-to-back webinars.
Where can you register for PASS Summit?
To register, just go to Register Now – and remember to take advantage of the $150 discount code from your local or Virtual PASS Chapter. We also have a great group discount for companies sending 5 or more employees. And don’t forget to purchase the session recordings for year-round learning on all aspects of SQL Server.
Once you get a taste for the learning and networking waiting for you at PASS Summit, we invite you to join the conversation by following us on Twitter (watch the #sqlpass and #summit14 hashtags) and joining our Facebook and LinkedIn groups. We’re looking forward to an amazing, record-breaking event, and can’t wait to see everyone there!
Please stay tuned for regular updates and highlights on Microsoft and PASS activities planned for this year’s conference.
What I like a lot is that everything related to the device side of my project (a thermometer thing that posts data to the Internet) is in one place. The project system ensures the IDE can be intelligent about code completion and navigation, I can see the npm modules I have installed, and I can use version control and directly push my changes back to a GitHub repository. The Terminal tool window lets me run the Tessel command line to run scripts and so on. No fiddling with additional tools so far!

Tessel Command Line Tools
As I explained in a previous blog post, the Tessel comes with a command line that is used to connect the thing to WiFi, run and deploy scripts, and read logs off it (and more). I admit it: I am bad at command line things. After a long time, commands get engraved in my memory and I’m quite fast at using them, but new command line tools, like Tessel’s, are something that I always struggle with at the start.
To help me learn, I thought I’d add the Tessel command line to WebStorm’s Command Line Tools. Through Project Settings | Command Line Tool Support, I added the path to Tessel’s tool (%APPDATA%\npm\tessel.cmd). Note that you may have to install the Command Line Tools plugin into WebStorm; I’m unsure if it’s bundled.
This helps in getting the Tessel commands available in the Command Line Tools after pressing Ctrl+Shift+X (or Tools | Run Command…), but it still does not help me learn this new command’s syntax, right? Copy this gist into C:\Users\<your username>\.WebStorm8\config\commandlinetools\Custom_tessel.xml and behold: completion for these commands!
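For anyone curious what such a completion file contains: it is a plain XML description of each subcommand. The sketch below only illustrates the general shape; the element names and attributes are my assumptions about the Command Line Tools plugin’s format, so treat the linked gist as authoritative.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sketch only; element names are assumptions, check the actual gist. -->
<framework name="tessel" invoke="%APPDATA%\npm\tessel.cmd" alias="tessel" enabled="true" version="1">
  <command>
    <name>run</name>
    <help>Deploy a script to the Tessel and run it</help>
    <params>script.js</params>
  </command>
  <command>
    <name>wifi</name>
    <help>Show or configure the WiFi connection</help>
  </command>
</framework>
```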
Again, I consider them as training wheels until I start memorizing the commands. I can remember tessel run, but it’s all the ones that I’m not using continuously that I tend to forget…

Running Code on the Tessel
Running code on the Tessel can be done using the tessel run <script.js> command. However, I dislike having to always jump into a console or even the command line tools mentioned earlier to just run and see if things work. WebStorm has the concept of Run/Debug Configurations, where using a simple keystroke (Shift+F10) I can run the active configuration without having to switch my brain context to a console.
I created two configurations: one that runs nodejs on my own computer so I can test some things, and one that invokes tessel run. Provide the path to node, set the working directory to the current project folder, specify the Tessel command line script as the file to execute and provide run somescript.js as the parameters.
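For a quick sanity check of such a run configuration, a minimal somescript.js can be the classic LED blink. This is a sketch under my own assumptions: the guarded require is my addition so the same file also runs off-device, and tessel.led is the on-board LED array on the original Tessel.

```javascript
// Minimal script to deploy with `tessel run somescript.js`.
var tessel;
try {
  // The tessel module is only available on the device itself.
  tessel = require('tessel');
} catch (e) {
  console.log('tessel module not found; running off-device');
}

if (tessel) {
  var led = tessel.led[0];
  setInterval(function () {
    led.toggle(); // blink twice a second
  }, 500);
} else {
  console.log('would toggle the on-board LED every 500 ms');
}
```

Pointing the run configuration’s parameters at this file and hitting Shift+F10 should show the log line in WebStorm’s Run tool window.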
Quick note here: after a few massive errors coming from Tessel’s command line tool mentioning that the device only supports one connection, it’s best to check the Single instance only box for the run configuration. This ensures the process is killed and restarted whenever the script is run.
Save, Shift+F10, and we’re deploying and running whenever we want to test our code.
Debugging does not work, as the Tessel does not support it. I hope support will be added, ideally using the V8 debugger so WebStorm can hook into it, too. Currently I’m doing “poor man’s debugging”: mostly dumping variables using console.log()…

External Tools
When I first added Tessel to WebStorm, I figured it would be nice to have some menu entries to invoke commands like updating the firmware (a weekly task, Tessel is being actively developed it seems!) or showing the device’s WiFi status. So I did!
External Tools can be added under IDE Settings | External Tools, organized into groups, and so on. Here’s what I entered for the “Update firmware” command:
It’s basically just running node, passing it the path to the Tessel command line script, and appending the correct parameter.
Now, I don’t use my newly created menu too much, I must say. Using the command line tools directly is more straightforward. But adding these external tools does give an additional advantage: since I have to re-connect to the WiFi every now and then (Tessel’s WiFi chip is a bit flaky when further away from the access point), I added an external tool for connecting it to WiFi and assigned a shortcut to it (IDE Settings | Keymaps, search for whatever label you gave the command and use the context menu to assign a keyboard shortcut). On my machine, Ctrl+Alt+W resets the Tessel’s WiFi now!

Installing npm Packages
This one may be overkill, but I found searching npm for Tessel-related packages quite handy through the IDE. From Project Settings | Node.JS and NPM, searching packages is pretty simple. And installing them, too! Careful, Tessel’s 32 MB of storage may not like too many modules…
Fun fact: writing this blog post, I noticed the grunt-tessel package which contains tasks that run or deploy scripts to the device. If you prefer using Grunt for doing that, know WebStorm comes with a Grunt runner, too.
That’s it for now. I do hope to tinker away on the Tessel in the next weeks and finish my thermometer and the app, so I can see the (historical) temperature in my house.
We spend our days (and nights, and really any time we have) developing quality, beautiful .NET applications. We pore over our code, tests, and coverage to make sure it is good. Some of us do that on our own, while others are part of a larger network of teams with managers, developers, and quality assurance members all along the way.
We all want to know that our code is good. We write tests to prove that and use coverage tools to show that we have strong tests. Rolling out a coverage process can feel pretty daunting whether you are in the same building or across the globe.
The team at Google recently gave us all a peek behind their curtain about implementing code coverage team-wide and its effects across the organization. (You can read the whole post here.) We boiled it down to the Cliff’s Notes version with some key takeaways.
Their team’s mission was to collect coverage-related data and then develop and implement code coverage practices company-wide. To make it easy, they designed an opt-in system where engineers could enable two different types of coverage measurements for their projects: daily and per-commit. With daily coverage, Google ran all tests for the project, whereas with per-commit coverage they ran only the tests affected by the commit. The two measurements are independent, and many projects opted into both.
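The per-commit mode can be pictured as a simple filter over a map from tests to the files they depend on. This sketch is mine, not Google’s code, and the dependency map is hypothetical; real systems derive it from the build graph.

```javascript
// Run only the tests whose dependencies intersect the commit's changed files.
// testDeps maps a test name to the source files it depends on (made-up data).
function testsAffectedBy(changedFiles, testDeps) {
  return Object.keys(testDeps).filter(function (test) {
    return testDeps[test].some(function (dep) {
      return changedFiles.indexOf(dep) !== -1;
    });
  });
}

var affected = testsAffectedBy(['parser.js'], {
  parser_test: ['parser.js', 'util.js'],
  render_test: ['render.js']
});
console.log(affected); // [ 'parser_test' ]
```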
The feedback from Google engineers was overwhelmingly positive. The most loved feature they noted was surfacing the coverage information during code review time. This early surfacing of coverage had a statistically significant impact: their initial analysis suggests that it increased coverage by 10% (averaged across all commits).
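To make the arithmetic behind that statistic concrete, here is a toy version with made-up numbers: averaging the per-commit before/after coverage deltas.

```javascript
// Average the coverage change across commits (hypothetical before/after values).
function averageCoverageDelta(commits) {
  var total = commits.reduce(function (sum, c) {
    return sum + (c.after - c.before);
  }, 0);
  return total / commits.length;
}

var delta = averageCoverageDelta([
  { before: 0.60, after: 0.72 },
  { before: 0.55, after: 0.63 }
]);
console.log((delta * 100).toFixed(1) + '%'); // "10.0%"
```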
Their process is ever-changing and growing. We will keep you posted on their activity along the way.
It took a while, but it is here. The most requested feature on the Azure Store is here:
This is currently only available on the East US region. That is going to change, but it will take a bit of time. You can vote on which regions you want RavenHQ on Azure to expand to.
RavenHQ on Azure can be used in one of two ways. You can purchase it via the Azure Marketplace, in which case you have to deal only with a single invoice, and you can manage everything through the Azure site. However, the Azure Marketplace doesn’t currently support prorated and tiered billing, which means that the plans that you purchase in the marketplace have hard limits on data. You could also purchase those same plans directly from RavenHQ and take advantage of usage-based billing, which allows you to use more storage than what’s included in the plan at a prorated cost.
RavenHQ is now offering a lower price point for replicated plans, so you don’t have to think twice before jumping into the high availability option.
When writing my last post, Using the OpenXML SDK Productivity Tool to "decompile" Office documents (turn *X files into the C# OpenXML SDK code that would generate them), I came across this:
I'm like, "What?" No...
Contains content from dev.office.com that is openly editable by the public.

Ways to contribute
You can contribute to Office developer documentation in a few different ways:
- Contribute to articles via the public Office developer docs repo*
- Report documentation bugs via GitHub Issues
- Add documentation requests to the Office/SharePoint developer UserVoice
*We're only taking documentation contributions for the OpenXML Conceptual content at this time.

Repository organization
The content in the office-content repository is grouped first by article language, then by topic. The README.md file at the root of each topic directory specifies the structure of the articles within the topic.
Articles within each topic are named by MSDN GUID rather than title name. This is a side effect of our document management process and cannot be changed at this time. We highly recommend using the table of contents within each topic directory (see links below) to navigate to the files you wish to view or edit.

Articles in this repository

Open XML

Before we can accept your pull request
Now that's cool...