
Automatically minify and combine JavaScript in Visual Studio

As you begin developing more complex client-side functionality, managing the size and shape of your JavaScript includes becomes a key concern. It’s all too easy to accidentally end up with hundreds of kilobytes of JavaScript spread across many separate HTTP requests, significantly slowing down your initial page loads.

To combat this, it’s important to combine and compress your JavaScript. While there are useful standalone tools and HttpHandler based solutions to the problem already, none of them work quite how I prefer. Instead, I’m going to show you my dead-simple method for automatically compressing and combining script includes.

To accomplish that in this post, we will select a compression utility, learn how to use it at the command line, explore a useful automation feature in Visual Studio, and apply that to keep scripts combined and compressed with no ongoing effort.

Selecting a JavaScript compression tool

The first thing we’ll need is a utility to compress our JavaScript. There are many utilities available, ranging from YUI Compressor to Dean Edwards’ Packer, each with its own strengths and weaknesses.

YUI Compressor is powerful, but requires a Java runtime to be available during the build process. Packer is popular for its Base62 encoding mode, but that form of compression carries a non-trivial performance tax on the client side.

In terms of simplicity, it’s hard to beat Douglas Crockford’s JSMin. It requires no command line options, no runtimes or frameworks, and accepts input directly from standard input (which will be useful for us later).

One common concern about JSMin is that it outputs less compact code than YUI Compressor and Packer at their most aggressive settings. However, this is a bit of a red herring. When gzipped, the results of all three boil down to almost exactly the same size across the wire. Since you should always serve your JavaScript with gzip compression at the HTTP level anyway, this initial “disadvantage” is moot.

Using JSMin from the command line

Using JSMin is very straightforward. For example, say we have the following, well-commented JavaScript and want to minify it:

// how many times shall we loop? 
var foo = 10;
// what message should we use? 
var bar = 'Encosia';
// annoy our user with O(foo) alerts! 
for (var i = 0; i < foo; i++) {
  alert(bar);
}

Assuming that JavaScript is in a file called AlertLoop.js, this command line usage of JSMin will minify it and output it to the console:

jsmin < AlertLoop.js


This runs jsmin and feeds the contents of AlertLoop.js to its standard input. It’s the same as if you had run jsmin and then typed all of that JavaScript in at the prompt.

Similarly, this usage does the trick if you want to redirect that output to a file:

jsmin < AlertLoop.js > AlertLoop.min.js


The minified output is less than half the size of the original. Not bad!
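For reference, JSMin’s output for AlertLoop.js looks something like the single line below. This is a sketch — JSMin’s exact whitespace handling may differ slightly — and the `alert` stub at the top is only there so the snippet runs outside a browser:

```javascript
// Stub alert so the snippet runs under Node; in a browser this isn't needed.
function alert(msg) { /* no-op stand-in for window.alert */ }

// Approximately what JSMin emits: comments stripped, whitespace collapsed.
var foo=10;var bar='Encosia';for(var i=0;i<foo;i++){alert(bar);}
```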

Note: If you’re wondering about the upper ASCII characters preceding the minified script, they’re nothing to be concerned about. Because I had created AlertLoop.js in Visual Studio, it was saved as UTF-8 by default and those characters are the UTF BOM (thanks to Oleg, Sugendran, and Bart for clarification).

Set up project directories

Before we get to the next steps, we need to define a structure for our project. The layout shown to the right works for simple projects.

Within the website project, the important takeaway is that the JavaScript files to be compressed are all in the same directory and named with a *.debug.js pattern.

Outside of the website, notice the “tools” directory which contains a copy of JSMin. I think we can all agree that executables should not be included within a website project if possible. That would just be begging for trouble.

However, I do suggest including an external tools directory and JSMin executable in your project’s source control. You never want to create a scenario where someone can’t perform a checkout and then a successful build immediately afterward.

Automation: Visual Studio earns its keep

To automate script compression as part of the build process, I suggest using a build event. There are perfectly legitimate alternatives, but I prefer having a tangible file sitting on disk and having that compression process automated. So, “building” the minified JavaScript include(s) as part of the build process makes the most sense to me.

Build events may sound complicated, but they aren’t at all. Build events are simply a mechanism for executing command line code before and/or after your project is compiled.

For our purposes, a post-build event is perfect. Additionally, we can specify that the event should only run when the project builds successfully. That way, we don’t waste time minifying the JavaScript when there are build errors.

Setting up a build event in Visual Studio

To add build events, right-click on your project and choose properties. In the properties page that opens, click on the “Build Events” tab to the left. You’ll be presented with something similar to this:


Note: If you’re using Visual Basic, there will be no build events tab in the project properties. Instead, look for a build events button on the “Build” tab, which provides access to the same functionality.

You can type commands directly in the post-build field if you want, but clicking the “Edit Post-build” button provides a better editing interface:


The interface’s macro list is especially useful. In particular, the ProjectDir macro will be handy for what we’re doing. $(ProjectDir) placed anywhere in a build event will be replaced with the actual project path, including a trailing backslash.

For example, we can use it to execute JSMin.exe in the hierarchy described above:

"$(ProjectDir)..\tools\jsmin"

Or, reference that same project’s js directory:

"$(ProjectDir)js\"

Putting it all together: Minify a single file

Now that we’ve covered how to use JSMin at the command line and how to execute command line scripts as part of Visual Studio builds, putting it all together is easy.

For example, to minify default.debug.js, this post-build event will do the trick:

"$(ProjectDir)..\tools\jsmin" < 
"$(ProjectDir)js\default.debug.js" > 
"$(ProjectDir)js\default.min.js"

(The line breaks are for readability here. The command in your actual build event must not contain them, or it will be interpreted as separate commands and fail.)

The quotes are important in case $(ProjectDir) happens to include directories with spaces in their names. Since you never know where this project may eventually be built, it’s best to always use the quotes.

*Really* putting it together: Combine files

I did promise more than just compression in the post’s title. Combining scripts is just as important as compression, if not more so. Since JSMin takes its input from stdin, it’s easy to roll scripts together for minification into a single result:

type "$(ProjectDir)js\*.debug.js" | 
"$(ProjectDir)..\tools\jsmin" > 
"$(ProjectDir)js\js-bundle.min.js"

This build event combines all of our *.debug.js scripts, minifies the combined script bundle, and then outputs it to a new file named js-bundle.min.js.

This is great if you want to combine your most commonly used jQuery plugins into a single payload, for example. A reduction in HTTP requests usually provides a nice improvement in performance. This is especially true when you’re dealing with JavaScript, because the browser blocks while script references load.

Dealing with dependencies

Cross-dependencies between scripts are one issue that requires extra consideration when combining. Just as ordering script includes incorrectly can break them, bundling scripts together in the wrong order may cause them to fail.

One relatively easy way to handle this is to give your scripts prefixes to force the correct order. For example, the source sample below includes this set of JavaScript files:

default.debug.js
jquery-1.3.2.debug.js
jquery-jtemplates.debug.js

Combining these and referencing the result will fail, because default.debug.js sorts ahead of both jQuery and the plugin by default. Since default.debug.js depends on both of those, this is a big problem. To fix it, rename the files with prefixes:

01-jquery-1.3.2.debug.js
05-jquery-jtemplates.debug.js
10-default.debug.js

Now it will work perfectly.

Any system of alphanumeric prefixes will work, but be sure to pad numbers with leading zeroes if you use a numeric system. Otherwise, the default sort ordering may catch you off guard (e.g. 2-file.js sorts ahead of 11-file.js through 19-file.js).
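The pitfall is easy to verify with JavaScript’s default lexicographic sort (the file names here are just hypothetical examples):

```javascript
// Without zero-padding, string comparison puts '11-' before '2-':
const unpadded = ['2-file.js', '11-file.js', '1-file.js'].sort();
console.log(unpadded); // → [ '1-file.js', '11-file.js', '2-file.js' ]

// Zero-padded prefixes sort in the intended numeric order:
const padded = ['02-file.js', '11-file.js', '01-file.js'].sort();
console.log(padded); // → [ '01-file.js', '02-file.js', '11-file.js' ]
```

The `type` concatenation in the build event relies on the same alphabetical ordering, so whatever this sort produces is the order your scripts are bundled in.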

To debug, or not to debug

Now that we have the minification process under control, one final issue to address is how to keep this from complicating our development workflow.

While editing these scripts, we certainly don’t want to be forced to recompile every time we make a change to the JavaScript. After all, one of the nice things about JavaScript is that it doesn’t require precompilation. Even worse, using a JavaScript debugger against minified files is a nightmare I wouldn’t recommend to anyone.

The easiest way I know of to ensure that the correct scripts are emitted for both scenarios is to check the IsDebuggingEnabled property of the HttpContext:

<% if (HttpContext.Current.IsDebuggingEnabled) { %>
  <script type="text/javascript" src="js/01-jquery-1.3.2.debug.js"></script>
  <script type="text/javascript" src="js/05-jquery-jtemplates.debug.js"></script>
  <script type="text/javascript" src="js/10-default.debug.js"></script>
<% } else { %>
  <script type="text/javascript" src="js/js-bundle.min.js"></script>
<% } %>

When the web.config’s compilation mode is set to debug, the *.debug.js versions of the files are referenced, and the auto-minified bundle otherwise. Now we have the best of both worlds.


I hope you’ll find that this technique is a good compromise between the tedium of using manual minification tools and the overwrought complexity of setting up some of the more “enterprisey” automation solutions.

One not-so-obvious benefit that I’ve noticed stems from minification’s automatic comment stripping. Without worrying about your comments burdening the size of the client-side payload or being distributed across the Internet, you’re more likely to comment your JavaScript well. When dealing with a dynamic language, sans compiler, I find that comments are often crucial to maintainability.

This is one of those problems with quite a few perfectly legitimate solutions. What do you think of this solution? How do you normally handle this?

Get the source

For demonstration, I took my jQuery client-side repeater example and applied this technique. With several JavaScript includes (one full of comments), it’s a perfect candidate for combining and compression.

One particular thing to notice in this example is the use of numeric prefixes to order the JavaScript includes, as mentioned earlier. This naming scheme is crucial when dealing with interdependent scripts. If the scripts are combined in the wrong order, your functionality will break just the same as if you had used script reference tags in the wrong order.

Download Source: jsmin-build.zip


Originally posted at Encosia. If you're reading this elsewhere, come on over and see the original.


More Stories By Dave Ward

Dave Ward wrote his first computer program in 1981, using good ‘ol Microsoft Color BASIC and cassette tapes for data storage. Over the years since then, he has had the opportunity to work on projects ranging from simple DOS applications to global telecommunications networks spanning multiple platforms.

Latest Stories
SYS-CON Events announced today that Loom Systems will exhibit at SYS-CON's 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. Founded in 2015, Loom Systems delivers an advanced AI solution to predict and prevent problems in the digital business. Loom stands alone in the industry as an AI analysis platform requiring no prior math knowledge from operators, leveraging the existing staff to succeed in the digital era. With offices in S...
SYS-CON Events announced today that Interoute, owner-operator of one of Europe's largest networks and a global cloud services platform, has been named “Bronze Sponsor” of SYS-CON's 20th Cloud Expo, which will take place on June 6-8, 2017 at the Javits Center in New York, New York. Interoute is the owner-operator of one of Europe's largest networks and a global cloud services platform which encompasses 12 data centers, 14 virtual data centers and 31 colocation centers, with connections to 195 add...
SYS-CON Events announced today that HTBase will exhibit at SYS-CON's 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY. HTBase (Gartner 2016 Cool Vendor) delivers a Composable IT infrastructure solution architected for agility and increased efficiency. It turns compute, storage, and fabric into fluid pools of resources that are easily composed and re-composed to meet each application’s needs. With HTBase, companies can quickly prov...
SYS-CON Events announced today that CA Technologies has been named “Platinum Sponsor” of SYS-CON's 20th International Cloud Expo®, which will take place on June 6-8, 2017, at the Javits Center in New York City, NY, and the 21st International Cloud Expo®, which will take place October 31-November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. CA Technologies helps customers succeed in a future where every business – from apparel to energy – is being rewritten by software. From ...
What if you could build a web application that could support true web-scale traffic without having to ever provision or manage a single server? Sounds magical, and it is! In his session at 20th Cloud Expo, Chris Munns, Senior Developer Advocate for Serverless Applications at Amazon Web Services, will show how to build a serverless website that scales automatically using services like AWS Lambda, Amazon API Gateway, and Amazon S3. We will review several frameworks that can help you build serverle...
SYS-CON Events announced today that SoftLayer, an IBM Company, has been named “Gold Sponsor” of SYS-CON's 18th Cloud Expo, which will take place on June 7-9, 2016, at the Javits Center in New York, New York. SoftLayer, an IBM Company, provides cloud infrastructure as a service from a growing number of data centers and network points of presence around the world. SoftLayer’s customers range from Web startups to global enterprises.
Culture is the most important ingredient of DevOps. The challenge for most organizations is defining and communicating a vision of beneficial DevOps culture for their organizations, and then facilitating the changes needed to achieve that. Often this comes down to an ability to provide true leadership. As a CIO, are your direct reports IT managers or are they IT leaders? The hard truth is that many IT managers have risen through the ranks based on their technical skills, not their leadership abi...
The essence of cloud computing is that all consumable IT resources are delivered as services. In his session at 15th Cloud Expo, Yung Chou, Technology Evangelist at Microsoft, demonstrated the concepts and implementations of two important cloud computing deliveries: Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). He discussed from business and technical viewpoints what exactly they are, why we care, how they are different and in what ways, and the strategies for IT to transi...
After more than five years of DevOps, definitions are evolving, boundaries are expanding, ‘unicorns’ are no longer rare, enterprises are on board, and pundits are moving on. Can we now look at an evolution of DevOps? Should we? Is the foundation of DevOps ‘done’, or is there still too much left to do? What is mature, and what is still missing? What does the next 5 years of DevOps look like? In this Power Panel at DevOps Summit, moderated by DevOps Summit Conference Chair Andi Mann, panelists l...
Web Real-Time Communication APIs have quickly revolutionized what browsers are capable of. In addition to video and audio streams, we can now bi-directionally send arbitrary data over WebRTC's PeerConnection Data Channels. With the advent of Progressive Web Apps and new hardware APIs such as WebBluetooh and WebUSB, we can finally enable users to stitch together the Internet of Things directly from their browsers while communicating privately and securely in a decentralized way.
All organizations that did not originate this moment have a pre-existing culture as well as legacy technology and processes that can be more or less amenable to DevOps implementation. That organizational culture is influenced by the personalities and management styles of Executive Management, the wider culture in which the organization is situated, and the personalities of key team members at all levels of the organization. This culture and entrenched interests usually throw a wrench in the work...
Keeping pace with advancements in software delivery processes and tooling is taxing even for the most proficient organizations. Point tools, platforms, open source and the increasing adoption of private and public cloud services requires strong engineering rigor - all in the face of developer demands to use the tools of choice. As Agile has settled in as a mainstream practice, now DevOps has emerged as the next wave to improve software delivery speed and output. To make DevOps work, organization...
DevOps is often described as a combination of technology and culture. Without both, DevOps isn't complete. However, applying the culture to outdated technology is a recipe for disaster; as response times grow and connections between teams are delayed by technology, the culture will die. A Nutanix Enterprise Cloud has many benefits that provide the needed base for a true DevOps paradigm.
What sort of WebRTC based applications can we expect to see over the next year and beyond? One way to predict development trends is to see what sorts of applications startups are building. In his session at @ThingsExpo, Arin Sime, founder of WebRTC.ventures, will discuss the current and likely future trends in WebRTC application development based on real requests for custom applications from real customers, as well as other public sources of information,
Historically, some banking activities such as trading have been relying heavily on analytics and cutting edge algorithmic tools. The coming of age of powerful data analytics solutions combined with the development of intelligent algorithms have created new opportunities for financial institutions. In his session at 20th Cloud Expo, Sebastien Meunier, Head of Digital for North America at Chappuis Halder & Co., will discuss how these tools can be leveraged to develop a lasting competitive advanta...