Developing on Staxmanade

Excited to Announce My New Career Opportunity

I have lots of bittersweet and mushy things to say about my previous employer of 5 years, however I’ll spare you from all that.

Last Monday (10/17/2011) I started my first official day with Vertigo.

I’m really looking forward to working with the many great people at Vertigo and all the interesting projects that will come my way.

With regards to this blog and my other OSS contributions, I don’t have any plans to change anything. I’m looking forward to possibly blogging about some different and interesting topics, but we’ll have to see.

Happy Coding!

Chocolatey - The free and open source Windows app store.

If you haven’t heard of it, you’re about to be delighted. Every developer (at least those on Windows) should know about this project, if for nothing more than to make setting up your dev machine a piece of cake.

What is Chocolatey?

Straight from the Chocolatey.org site:

Chocolatey NuGet is a Machine Package Manager, somewhat like apt-get, but built with windows in mind.

If you’ve never used a Linux machine or don’t know the power of an “apt-get”-like tool, well, it’s basically the simplest possible way to install an application on your machine.

What could be simpler than finding the app’s website, downloading the app, and next, next, nexting through the installer? How about just typing “chocolatey install notepadplusplus” at a PowerShell command prompt? That simple little command will download and install Notepad++ right on your machine with virtually no need to interact with the installer. AWESOME!!!

Disclaimer!

I know there are other installer applications out there that aggregate and install different programs; however, to be honest, I don’t use any of them. I am also going to assume that most of them aren’t catered to the Windows developer (maybe I’m wrong). Either way, I like this project and I’m just trying to share it with the community. So There…

Ok, my above salesmanship is a little loud-mouthed, but maybe your interest is piqued enough to give it a try.

How do I install Chocolatey?

It’s about as simple to install Chocolatey as it is to use Chocolatey to install other applications: one single PowerShell command. Just paste the command below into your PowerShell prompt and let ’er rip.

First, make sure you have your PowerShell execution policy set to “Unrestricted”.
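If you haven’t changed your execution policy before, this is one way to do it (run from an elevated PowerShell prompt — note that “Unrestricted” is the loosest setting, so use your own judgment):

```powershell
# Allow downloaded scripts (like the Chocolatey installer) to run.
Set-ExecutionPolicy Unrestricted
```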

Run the Chocolatey install.

  iex ((new-object net.webclient).DownloadString("http://bit.ly/psChocInstall"))

Packages!

Now that you have chocolatey installed, head over to the Chocolatey.org website and browse the packages you can now install.

EX:

C:\Code>chocolatey install notepadplusplus

How can I know when new packages are added to the Chocolatey.org feed?

You can get more background on this approach in my previous post, What’s happening on the NuGet feed (leveraging OData in an RSS reader).

The direct RSS link I have is as follows:

http://chocolatey.org/api/feeds/Packages()?$filter=Id%20ne%20'SymbolSource.TestPackage'&$orderby=Published%20desc

Plug that into your RSS reader and you should be notified when new packages are added to the Chocolatey.org feed.
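If you’d rather poke at the feed from a script than an RSS reader, a quick PowerShell sketch along these lines (untested against any particular version of the feed, so treat it as a starting point) will dump the newest package titles:

```powershell
# Download the OData/ATOM feed and print the 10 most recent package titles.
# The backtick before $orderby stops PowerShell from treating it as a variable.
$url = "http://chocolatey.org/api/feeds/Packages()?`$orderby=Published%20desc"
[xml]$feed = (New-Object System.Net.WebClient).DownloadString($url)
$feed.feed.entry | Select-Object -First 10 | ForEach-Object { $_.title.'#text' }
```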

Happy Setting up your Dev Machine!

What’s happening on the NuGet feed (leveraging OData in an RSS reader)

Ever since NuGet came online I’ve been wanting a way to find out about new packages, and updates to packages.

Since OData extends the ATOM feed, and you can hook an OData feed up to any RSS reader, I set out to find a way to get at those recent updates to the NuGet feed and find out when new packages were published.

If you’re not completely familiar with OData, later in this post I explain how I arrived at the URL below. However, if you don’t care how I arrived at the solution, below is the final RSS link I’m currently using in my RSS reader (Google Reader) to monitor updates to the NuGet feed.

http://packages.nuget.org/v1/FeedService.svc/Packages()?$filter=Id%20ne%20'SymbolSource.TestPackage'&$orderby=Published%20desc

How did I discover or build that URL?

I could have memorized the OData URI spec and constructed the above link by hand but I’m far more familiar with C# and LINQ and instead used LINQPad.

Open up LINQPad and add a WCF Data Services (OData) connection to the following URL

http://packages.nuget.org/v1/FeedService.svc

Now you can query the OData feed with some LINQ.

from p in Packages
where p.Id != "SymbolSource.TestPackage"
orderby p.Published descending
select p

When you execute this LINQ query in LINQPad, you can click on the “SQL” in the results pane to view the URL that was generated to execute the operation.

Now, my original LINQ expression didn’t have the where p.Id != "SymbolSource.TestPackage" clause, as I didn’t know this package would become a regular pain to view in the RSS reader.

One great thing about OData is the ability to re-craft the URL for this feed to ignore items that either show up so much that I want to exclude them (like the “SymbolSource.TestPackage”) or a certain class of items that I just don’t want to be alerted on (maybe filtering by the NuGet Tag property).
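For example, if I also wanted to hide packages carrying some particular tag, an extra clause in the $filter might look something like the following (a hypothetical sketch — “CommonJS” and the exact Tags property usage are just for illustration):

```
http://packages.nuget.org/v1/FeedService.svc/Packages()?$filter=Id%20ne%20'SymbolSource.TestPackage'%20and%20not%20substringof('CommonJS',Tags)&$orderby=Published%20desc
```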

Some observations of the feed.

I’ve been monitoring this feed for almost a month now and have learned about some very interesting projects; check the next section of this post for a list of some of the more interesting ones (to me) that I’ve found.

So far the feed has become just a regular part of my daily blog reading. It’s the quickest one to do, as I’m typically skimming the titles of the RSS items and only slowing down to dig into projects I’ve never heard of that sound interesting.

Google Reader’s trends page shows that this feed generates about 54 items a day. That may seem like a lot, but it’s really easy to click the “mark all as read” button and go on with the day.

Interesting projects I’ve discovered.

I’ve been watching this RSS feed for almost 2 weeks now, and have discovered some new projects (at least new to me) and learned about updates to projects I already knew about.

Below is a small list of ones I thought were interesting – there’s way more being done out there.

New to me
NOT new to me (but released while I was watching)

Could NuGet be a new metric for what’s popular or up and coming?

That heading is a little bolder than what I actually think, mostly because there are far too many variables to make that statement hold strong footing. Regardless, I have noticed some interesting “trends” (if you can define a trend by my watching the feed for about a month) in what is being released on NuGet, and I wonder if watching this over time will be a nice window into the types of projects people are really working on.

I’ve seen quite a few projects related to messaging or Event Sourcing. And a number of different JS and CSS minification/build tooling projects.

Comments

Thomas Ardal
Thanks for sharing. I wasn't aware that existing RSS readers would actually be able to understand OData, even though it's an extension to Atom.

By the way, there's already a project for watching NuGet packages through RSS here: NuGetFeed.org.

With NuGetFeed.org you will be able to follow your favorite NuGet packages through RSS. There's a feature called MyFeed, where you will be able to add a list of packages you want to follow. If you use Google Chrome, there's an extension as well and finally a Visual Studio add-in is also available. Hope you will find NuGetFeed.org useful.

Powershell Text-To-Speech and fun with a 4yr old.

I’m not so sure this fits in the “elegant code” theme, but it’s a “fun with code” topic that someone might enjoy. Especially if you have a little one.

My 4yr old is learning how to spell small and simple words like her name, “Mom”, “Dad”, etc, and continuing her exploration with letters on the keyboard. She’s been banging on a keyboard since her early years on babysmash. In fact, I came home one day to find my monitor turned 90 degrees and about every possible admin window open in the background because certain key combinations were not trapped by babysmash. But I digress…

For a while she was typing some text into notepad and asking me what it spelled.

“ajlkjwelsl” –> What’s that spell daddy?

I then thought it would be fun if the computer could give instant feedback about what she typed and in a matter of a minute or so I whipped up this little “game” which we had fun playing for a bit.

You can view the gist here - https://gist.github.com/1180060

Just paste the function from the gist into a PowerShell prompt and run it. Type some text (make sure your computer’s sound is on) and press Enter to hear it.

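If you don’t feel like opening the gist, the core of it is roughly this (a hypothetical sketch, not the exact code from the gist — the function name here is made up):

```powershell
# Read a line of typed text and speak it out loud, forever.
Add-Type -AssemblyName System.Speech
$voice = New-Object System.Speech.Synthesis.SpeechSynthesizer

function Start-SpellingGame {
    while ($true) {
        $text = Read-Host "Type something and press Enter"
        $voice.Speak($text)
    }
}

Start-SpellingGame
```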

I typed some of the usual things we say around the house and my 4yr old wouldn’t stop laughing…

Give it a try with your little ones (or big ones). Even let your non-techie significant other have a go – he/she may have some fun with it.

Slightly modified “CD” Command for Powershell

Background

In my previous job, I spent all my development time in a Linux environment. I was rather impressed at how much could get done at the command line, and how efficient some of those tasks became. My next job was based on Windows and mostly the Microsoft stack of development tools. This meant I pretty much left the command line behind. That was, until I started using git. And since I wanted to learn PowerShell, I used PowerShell to execute my git commands.

One thing that has bugged me for a while is simply moving between directories. Even with tab completion, all that typing is still quite annoying, especially if you jump between a set of similar directories. One feature from the Linux cd command that I missed was “cd -”, which jumps to the previous directory (and then back again). One limitation of that command is it can only jump back to the single previous directory; it doesn’t retain a memory of recent directories. There may be something better in Linux that I don’t know of, but I’m basing this on limited experience from a number of years ago.

So I threw a question out on twitter.

After several tweets back and forth, @cwprogram threw an interesting spike at me: http://pastebin.com/xwtkn0am

Although this wasn’t exactly what I was looking for, it contained enough of what I needed to spark my curiosity to write a version of my own.

And so a little script was born that I’m now using to replace the “CD” command in my PowerShell runtime.

What does this do?

After you get it installed (see install steps below), typing “CD” with no parameters at the command prompt will list up to 10 of the most recent distinct paths you’ve been to. The list also gives each path an index number that you can use as a shortcut to jump to it.

Example:

C:\code> cd
     1) C:\Users\jasonj
     2) D:\temp
C:\code> cd 2
D:\temp>

You can continue to use the “CD” command to do your usual changing of directories. Now you can quickly get a history of where you’ve been, and quickly jump to any of those previous directories without typing the entire paths again.

It defaults to only showing you the last 10 distinct items, but if you find yourself needing to go back farther than that, you can use the following command to list more:

D:\temp> cd -ShowCount 100
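For the curious, the core idea can be sketched in a few lines (a simplified, hypothetical version — the real Change-Directory.ps1 linked in the install steps handles more edge cases):

```powershell
# Keep a history of visited directories; no args lists them, a number jumps.
$global:cdHistory = @()

function Change-Directory {
    param([string]$Path, [int]$ShowCount = 10)

    if (-not $Path) {
        # No argument: print recent distinct paths with an index.
        $i = 0
        $global:cdHistory | Select-Object -Unique -First $ShowCount |
            ForEach-Object { $i++; "{0,6}) {1}" -f $i, $_ }
        return
    }

    if ($Path -match '^\d+$') {
        # Numeric argument: jump to that entry in the history list.
        $Path = @($global:cdHistory | Select-Object -Unique)[[int]$Path - 1]
    }

    # Remember where we were, then actually change directory.
    $global:cdHistory = ,(Get-Location).Path + $global:cdHistory
    Set-Location $Path
}
```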

How to Install

  1. Download the file and save it to a location you can reference later.
    https://github.com/staxmanade/DevMachineSetup/blob/master/GlobalScripts/Change-Directory.ps1
  2. Open your $PROFILE (What is that?)
  3. Type the following two commands into your profile to replace the existing “CD” command with the new one.

    Remove-Item alias:cd
    Set-Alias cd {Your_Saved_Directory}\Change-Directory.ps1
  4. Exit your PowerShell console and start a new one up.

Happy Commanding!

Comments

Jason.Jarrett
Thanks for the tip.
Bartek Bielawski
If you want some more *nix features, including cd - you may want to look at pscx.codeplex.com - I made mistake of ignoring what it has to offer and also re-invented cd- in my module. ;) IMO pscx should be added to win build by default. ;)

StatLight 1.4 and almost 1.5

It’s been a while since I blogged about any updates to the StatLight project, and even though people are saying “Silverlight’s dead,” I’d have to say there’s been more community contribution to the project in the last few months than there’s ever been.

What is StatLight?

For those of you who don’t know, it’s a simple little command line tool you can use to execute tests for Silverlight test projects. You can get more information at the project’s home page and documentation page.

 

Release of StatLight 1.4

It’s been several months since I posted the 1.4 release of StatLight. It was full of all kinds of goodies. Go checkout the release page to see what updates were included.

There was a regression introduced with this release that caused UI tests in TeamCity to not report correctly. It’s since been fixed, and you can use the CodeBetter TeamCity server to pull down the latest build of StatLight if you need this fix. If you hit the login page, just click on the “login as a guest” link to access the TeamCity builds.

 

Release of StatLight 1.5 (soon, maybe, sometime)

If you are interested in what is to come with the next release, you can head over to the “planned” tab and check out some of the new features.

But I’ll write about some of them here anyway.

 

Community Contribution.

xUnitContrib Test Runner

I’d like to say thanks to Remo Gloor for his contribution of the official xUnitContrib Silverlight test runner. Remo created a StatLight runner host that leverages the xUnitContrib Silverlight test runner to execute xUnit tests. This provides some great xUnit support and is considerably faster than the originally implemented xUnitLight adapter.

 

Growl Plugin

Geir-Tore Lindsve leveraged the new extensibility model recently added to StatLight to create a plugin that would notify Growl of failing tests. You can check out the project at https://github.com/lindsve/Statlight.Growl

 

ReSharper 6

ReSharper support has been around for a while now with the AgUnit project. I was recently contacted by Steven Kock, who wanted to see if it were possible to dump his custom Silverlight test runner and leverage StatLight. I’ve been working with him on this, and am really excited about the value add we get from having Steven push StatLight around. I’m looking forward to his suggestions, and who knows – we may get some much better performance out of StatLight. Even though it’s still under development, I can’t say how awesome it was to open up a Silverlight test project and execute a single test using my R# shortcut.

Git on Windows: Creating a network shared central repository.

I was doing some basic Git training for a customer this past week and they asked about how to setup their repositories to push/pull from a network share. I thought it would be simple and we spent a few minutes in class trying to accomplish it. We stopped trying in class and I took it as a homework assignment to figure it out before the next lesson. It was a little bit of a struggle to get this working for me, so I thought I’d throw this out there for any windows developers trying to do a similar thing.

 

I tend to prefer the command line to any of the git UI tools (except when visualizing history and diffing files). In this post I’m going to show how you can do it at the command line, but I’ll also show how you can do it with git gui, which, in this case, takes fewer steps.

 

How to push a local repository up to an (un-initialized) remote windows share.

 

Command Line:

I tend to run git within PowerShell, however the following set of commands cannot be run within the PowerShell prompt. If you figure out a way, I’d love to hear about it. And since I use the PowerShell prompt, I’m not sure how this would play out with the bash command.

Open a command prompt (cmd.exe) and follow the below steps to create a remote windows repository share.

CD into the context of your local repository. Say my repo was at “C:\Code\MyGitRepo1”.

cd C:\Code\MyGitRepo1 

Next we’re going to change our current directory to the remote share location.

Something I learned during this process is that cmd.exe doesn’t allow you to “cd” into a UNC network share path.

To get around not being allowed to “cd” into a UNC network share, we’ll use the pushd command. The reason this works is that pushd actually maps a network drive to the network share location.

pushd \\remoteServer\git\Share\Folder\Path

Now that we’re in the remote location we can create a bare git repository.

mkdir MyGitRepo1
cd MyGitRepo1
git init --bare

Your remote empty repository has now been created. Let’s go back to our local repository

popd

popd will “CD” back to the previous location (“C:\Code\MyGitRepo1”) and also remove the network drive mapping the pushd command created above.

So we should be back in the context of our local git repo.

C:\Code\MyGitRepo1\ >

 

Now all we need to do is add the newly created remote bare repository to our local repo and push our code up.

Notice the direction of the slashes in the path below (this stumped me for a bit) 

git remote add origin //remoteServer/git/Share/Folder/Path/MyGitRepo1
git push origin master

Kind of a pain at the command prompt really, but it’s not something that’s done all that often.

Using Git gui instead:

Open up the GUI

git gui

Click the [Remote->Add] menu option to bring up the “Add Remote” dialog.


Enter the name for your remote. “origin” is pretty typical for the central repository, but you can call it whatever you want. Then type the remote location. Notice the direction of the slashes.


Now you should be good to go.

 

Hope this helps someone else, and if anyone knows of a better/easier way I’d love to hear it.

Comments

Jason.Jarrett
Thanks for the nice comment. In regards to your question, I'm afraid I'm not familiar enough to be of help.

You might take your question over to StackOverflow.com as there are some very smart people over there that might be able to help.

Good Luck
Djilali Tabbouche
Hi Jason, thanks for your post.
I'm using this exact setup to deploy applications to both linux and windows server.
No problem on linux using ssh, and pushing to windows through network shares works fine, but I have one issue with post-receive hooks: I use this hook to checkout the remote repository to the application directory and run configuration tasks, and on windows the git command uses the local computer environment (git-dir and work-tree).
I've tried every options without success.

Any idea?

Using VSDBCMD to deploy an Entity Framework (EF) CodeFirst (or any other) database to AppHarbor

If you’ve taken the jump to try out the new Entity Framework Code First and you’re allowing it to generate your database for you, you’ve most certainly run into the lack of support for migrations/updating existing schema. Currently EF Code First will only create a database; it won’t update a database with the changes necessary to bring it in line with your model. I know they’re working on it, but since it’s not there yet, I thought I’d share a possible solution, albeit less polished than some of the well known database change management tools out there.

Where is the tool?

You can access it in the VS Command window. On my x64 machine the tool is at:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\VSTSDB\Deploy\vsdbcmd.exe

I want to deploy an existing schema to AppHarbor.

Some high level steps that you can use for deployment of database changes.

  1. Generate an original reflection of your database. (*.dbschema file)
  2. Tiny little hack to the .dbschema file.
  3. Generate the change file to AppHarbor
  4. Review Change Script Generated
  5. Take the app offline. (optional)
  6. Apply Change Script
  7. Bring the app online (mandatory if you took step 5)

Generate an original reflection of your database.

This file is a complete reflection of your database’s schema in a single xml file.

The following command can be used to generate this file.

vsdbcmd.exe
     /Action:Import
     /ConnectionString:"Data Source=.\sqlexpress;Initial Catalog=MyDatabase;Integrated Security=True;Pooling=False"
     /ModelFile:MyDatabase.dbschema

There are a ton of knobs to turn with this command line tool. Feel free to check out the docs http://msdn.microsoft.com/en-us/library/dd193283.aspx

Now you should have a file “MyDatabase.dbschema” sitting on your hard drive.

Tiny little hack to the .dbschema file.

The section of xml we want to manually remove from the file is related to where your mdf and ldf database files should exist on disk. When we go to deploy up to AppHarbor, if this is not removed, then vsdbcmd will generate script to attempt to move the files into the “correct” location. This operation will throw exceptions if you attempt to execute against AppHarbor as you don’t have permission to do this. We’re removing it from the xml file, as I can’t seem to get the correct command line option to ignore this (if there is an option). So by removing it, it’s just not used and completely ignored.

I don’t know if this will be true for everyone, but I find that the last two sections of xml in the dbschema file are all I have to remove. I’ll show the two full sections below so you can use it as a reference of what to remove from the file.

<Element Type="ISql90File" Name="[MyDatabase]">
<Property Name="FileName" Value="$(DefaultDataPath)$(DatabaseName).mdf" />
<Property Name="Size" Value="2304" />
<Property Name="SizeUnit" Value="3" />
<Property Name="FileGrowth" Value="1024" />
<Property Name="FileGrowthUnit" Value="3" />
<Relationship Name="Filegroup">
<Entry>
<References ExternalSource="BuiltIns" Name="[PRIMARY]" />
</Entry>
</Relationship>
</Element>
<Element Type="ISql90File" Name="[MyDatabase_log]">
<Property Name="FileName" Value="$(DefaultLogPath)$(DatabaseName)_log.LDF" />
<Property Name="Size" Value="576" />
<Property Name="SizeUnit" Value="3" />
<Property Name="MaxSize" Value="2097152" />
<Property Name="IsUnlimited" Value="False" />
<Property Name="FileGrowth" Value="10" />
<Property Name="FileGrowthUnit" Value="1" />
<Property Name="IsLogFile" Value="True" />
</Element>


Generate the change file to AppHarbor.

Now that we have a .dbschema file containing the complete model of what we want deployed, we can use it to generate a schema change deployment script.

vsdbcmd.exe
     /Action:Deploy
     /DeployToDatabase:-
     /Script:Test.sql
     /ConnectionString:"{YourAppHarborConnectionString}"
     /ModelFile:HackIt.dbschema
     /Properties:TargetDatabase={YourAppHarborDatabaseName EX:db1235}




I’ll explain a couple of the above command options.

/DeployToDatabase:-

This one is _key_. It tells vsdbcmd to only generate a change script, and not to actually deploy the changes immediately. Until you feel comfortable with the sql the tool generates, which is usually pretty darn good, you should not apply it immediately. Allow the tool to generate the file for further inspection and you can execute it manually after.

/Script:Test.sql

This is just the name of the file to dump the deployment changes to.

/ModelFile:HackIt.dbschema

The path to the .dbschema file we generated and modified above.



 



Review Change Script Generated.

After you’ve generated a change script file, take a look at the sql just to make sure you’re happy with what it generated.

Take the app offline. (optional)

This one depends on the schema changes. If the changes are serious enough, you can check in an App_Offline.htm file at the root of your web project and do a “git push appharbor”. This way, while making schema changes you don’t have to worry about errors popping up on users. The downside is that your site becomes inoperable.

If you’ve never heard of App_Offline.htm, I’d recommend reading up on it. http://weblogs.asp.net/scottgu/archive/2006/04/09/442332.aspx
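Taking the site offline is then just a normal commit and push (a sketch — “appharbor” here is whatever you named your AppHarbor remote):

```
git add App_Offline.htm
git commit -m "Take the site offline for schema changes"
git push appharbor master
```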




Apply Change Script.

You have several options to actually apply the scripted changes.

  1. Use vsdbcmd to deploy – just turn the /DeployToDatabase:- into /DeployToDatabase:+ and allow vsdbcmd to apply the script right there.
  2. Use SQL Management Studio – make sure you turn on SQLCMD Mode.

Bring your site back online.

Now you can rename the App_Offline.htm to something like App_Offline.htm.disabled and push those changes back up to AppHarbor.



Other considerations.

One great benefit of this approach is the ability for vsdbcmd to manage changes to an existing schema.

If you want full support for refactorings such as table and column renames, you will want to keep a full db project and use that to do your deployments.

Hope you find this useful. Happy Deployment!

Dynamically load embedded assemblies – because ILMerge appeared to be out.

At work, I started building a .net assembly that would probably find its way into a number of the server processes and applications around the shop. This particular assembly was going to end up containing quite a number of external open source references that I didn’t want to expose to the consumer of my library.

I set out to solve several simple requirements.

  1. Easy to use. Should be nothing more than adding a reference to the assembly (and use it).
  2. Consumer should not have to deal with the 5 open source libraries it was dependent on. Those are an implementation detail and it’s not necessary to expose those assemblies to the consumer, let alone have to manage the assembly files.

I originally got the idea from Dru Sellers’ post http://codebetter.com/blogs/dru.sellers/archive/2010/07/29/ilmerge-to-the-rescue.aspx

I gave ILMerge a try. As a post build event on the project, I ran ILMerge and generated a single assembly, leveraging the internalize functionality of ILMerge so my assembly wouldn’t expose all of its open source dependencies through Visual Studio’s IntelliSense.

This almost gave me the output I wanted. Single assembly, compact, easy to use… Unfortunately, when I tried to use the assembly I started seeing .net serialization exceptions. Serialization from my ILMerged assembly could not be deserialized on the other end because that type was not in an ILMerged assembly, but in the original assembly. (Maybe there’s a way to work around this, but I didn’t have time to figure that out; would love to hear any comments.)

So ILMerge appeared to be out, what next?

My coworker, Shawn, suggested I try storing the assemblies as resource files (embedded in my assembly). He uses the SmartAssembly product from Red Gate in his own projects, and mentioned that their product can merge all of your assemblies into a single executable – storing the assemblies in a .net resource file within your assembly/executable. This actually seemed easy to accomplish so I thought I’d give it a try.

How I did it.

Step 1: Add the required assemblies as a resource to your project. I chose the Resources.resx approach and added each assembly file to Resources.resx. I like this because of how simple it is to get the items out.

Step 2: We need to hook into the first point of execution – main(…), or in my case, since this was a library with a single static factory class, the static constructor of that factory – and include the following lines of code.

static SomeFactory()
{
    var resourcedAssembliesHash = new Dictionary<string, byte[]> {
        {"log4net", Resources.log4net},
        {"Microsoft.Practices.ServiceLocation", Resources.Microsoft_Practices_ServiceLocation},
    };

    AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
    {
        // Get only the name from the fully qualified assembly name (prob a better way to do this EX: AssemblyName.GetAssemblyName(args.Name))
        // EX: "log4net, Version=??????, Culture=??????, PublicKeyToken=??????, ProcessorArchitecture=??????" - should return "log4net"
        var assemblyName = args.Name.Split(',').First();

        if (resourcedAssembliesHash.ContainsKey(assemblyName))
        {
            return Assembly.Load(resourcedAssembliesHash[assemblyName]);
        }

        return null;
    };
}


I’ll talk a little about each step above.

var resourcedAssembliesHash = new Dictionary<string, byte[]> {
    {"log4net", Resources.log4net},
    {"Microsoft.Practices.ServiceLocation", Resources.Microsoft_Practices_ServiceLocation},
};

The first chunk is a static hash of the (key=assembly name, value=byte array of the actual assembly). We will use this to load each assembly by name when the runtime requests it.



AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{...

Next we hook into the app domain’s AssemblyResolve event, which allows us to customize (given a certain assembly name) where we load the assembly from. Think external web service, some crazy location on disk, database, or, in this case, a resource file within the executing assembly.



// Get only the name from the fully qualified assembly name (prob a better way to do this EX: AssemblyName.GetAssemblyName(args.Name))
// EX: "log4net, Version=??????, Culture=??????, PublicKeyToken=??????, ProcessorArchitecture=??????" - should return "log4net"
var assemblyName = args.Name.Split(',').First();

Next we figure out the name of the assembly requesting to be loaded. My original implementation used …Name.Split(',').First() to get the assembly name out of the full assembly name, but as I was writing up this blog post I thought there must be a better way to do this. I haven’t put in the effort to verify it, so give this a try and let me know – try using AssemblyName.GetAssemblyName(args.Name) instead.
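One caveat worth noting: AssemblyName.GetAssemblyName(...) actually takes a file path, so for parsing a display name like args.Name the AssemblyName constructor may be the better fit (in C# that would be new AssemblyName(args.Name).Name). A quick PowerShell sketch to convince yourself (the version value here is made up):

```powershell
# AssemblyName's constructor parses a full display name; .Name is the simple name.
$fullName = "log4net, Version=1.2.10.0, Culture=neutral, PublicKeyToken=null"
(New-Object System.Reflection.AssemblyName($fullName)).Name  # -> log4net
```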



if (resourcedAssembliesHash.ContainsKey(assemblyName))
{
    return Assembly.Load(resourcedAssembliesHash[assemblyName]);
}

Next we check that the assembly name exists in our hash declared initially, and if so we load it up…



    return null;
};

Otherwise, the assembly being requested is not one we know about, so we return null to allow the framework to figure it out the usual ways.



Step 3: Finally, I created a post-build event that removes the resourced assemblies from the bin\[Debug|Release] folders. This allowed me to have a test project that only had a dependency on the single assembly and verify that using it actually works (because it has to load its dependencies to work correctly and they don’t exist on disk).



Please consider.

  • You may not have fun if you package some of the same assemblies that your other projects may/will reference (especially if they are different versions).
  • I can’t say I have completely wrapped my head around the different problematic use cases this strategy could bring to life. (Use with care.)

Bookmark to inject FireBug Light into Internet Explorer.

I’ve enjoyed Firebug in Firefox, and even find value in Firebug Lite when used in Internet Explorer. However, if you don’t have control of the site, or don’t want to place the Firebug js include in your web site, I figured out a way to load it on demand with a bookmark in Internet Explorer.

I created a new text file on my Windows 7 machine named FireBug.url and placed it in
C:\Users\{username}\Favorites\{WhateverFolderYouWant}

Then paste the following into the file and save.

NOTE: I’m using jQuery in the javascript link – so if you need it more generic you’ll have to replace the jQuery usage.

[DOC_FirebugUI]
ORIGURL=about:blank
[{000214A0-0000-0000-C000-000000000046}]
Prop3=19,2
[InternetShortcut]
URL=javascript:$('body').append('<script type="text/javascript" src="https://getfirebug.com/firebug-lite.js"></script>'); void(0);
IDList=
IconFile=http://getfirebug.com/img/favicon.ico
IconIndex=1

You should now be able to load up firebug in IE with a single click.

Note: I have the IconFile property in there, but can’t seem to get it to work.
