Global text highlighting in Sublime Text 3

Sublime Text is currently my favorite text editor, the one I use whenever I have to leave Visual Studio, and this post is about how to make it highlight ISO dates in any text file, regardless of file format.

Sublime Text obviously has great syntax coloring support for highlighting keywords, strings and comments in many different languages, but what if you want to highlight text regardless of the type of file you’re editing? In my case I want to highlight dates to make it easier to read log files and other places where ISO dates are used, but the concept is general. You might want to indicate TODO or HACK items and the like, and this post shows how to do that in Sublime Text.

Here’s an example of what we want to achieve:

Here is the wanted result, a log file with a clearly indicated date marking the start of a logged item.

We’re going to solve this using a great Sublime package written by Scott Kuroda, PersistentRegexHighlight. To add highlighting of dates in Sublime Text, follow these steps:

  1. Install the package: press Ctrl + Shift + P to open the command palette, type “ip” and select the “Package Control: Install Package” command, then press Enter to show a list of packages. Select the PersistentRegexHighlight package and press Enter again.
  2. Next we need to start configuring the package. Select the menu item Preferences / Package Settings / PersistentRegexHighlight / Settings – User to show an empty settings file for the current user. Add the following content:
    
    {
       // Array of objects containing a regular expression
       // and an optional coloring scheme
       "regex":[
         {
           // Match 2015-06-02, 2015-06-02 12:00, 2015-06-02 12:00:00,
           // 2015-06-02 12:00:00,100
           "pattern": "\\d{4}-\\d{2}-\\d{2}( \\d{2}:\\d{2}(:\\d{2}(,\\d{3})?)?)?",
           "color": "F5DB95",
           "ignore_case": true
         },
         {
           "pattern": "\\bTODO\\b",
           "color_scope": "keyword",
           "ignore_case": true
         }
       ],
    
       // If highlighting is enabled
       "enabled": true,
    
       // If highlighting should occur when a view is loaded
       "on_load": true,
    
       // If highlighting should occur as modifications happen
       "on_modify": true,
    
       // File pattern to disable on. Should be specified as Unix style patterns
       // Note, this looks at the absolute path to match the pattern. So if trying
   // to ignore a single file (e.g. README.md), you will need to specify
       // "**/README.md"
       "disable_pattern": [],
    
   // Maximum file size to run the PersistentRegexHighlight on.
       // Any value less than or equal to zero will be treated as a non
       // limiting value.
       "max_file_size": 0
    }
    
  3. Most of the settings should be pretty self-explanatory; basically we’re using two highlighting rules in this example:
    1. First we specify a regex that finds all occurrences of ISO dates (e.g. “2015-06-02”, with or without a time part appended) and marks them with a given color (using the color property). If you want to sanity-check the regex outside Sublime, see the small sketch after this list.
    2. The second regex specifies that all TODO items should be colored like code keywords (using the color_scope property). Other valid values for the scope include “name”, “comment” and “string”.
  4. When saving the settings file you will be asked to create a custom color theme. Click OK in this dialog.
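As a quick aside, if you want to verify that the date pattern matches the formats you care about, you can test it outside Sublime before putting it into the settings file. Here is a minimal C# sketch (note that the settings file needs the backslashes doubled because of JSON escaping, while a C# verbatim string does not):

using System;
using System.Text.RegularExpressions;

class IsoDateRegexCheck
{
    static void Main()
    {
        // Same pattern as in the settings file, without the JSON escaping
        var isoDate = new Regex(@"\d{4}-\d{2}-\d{2}( \d{2}:\d{2}(:\d{2}(,\d{3})?)?)?");

        var samples = new[]
        {
            "2015-06-02",
            "2015-06-02 12:00",
            "2015-06-02 12:00:00,100",
            "not a date"
        };

        foreach (var sample in samples)
            Console.WriteLine("{0} -> {1}", sample, isoDate.IsMatch(sample));
    }
}

Running it should print True for the three date variants and False for the last sample.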

Done! Now, when you open any file with content matching the regexes given in the settings file, that content will be colored.

Tips

  1. Sometimes it’s necessary to touch the file to trigger a repaint (type a character and delete it).
  2. The regex option is an array so it’s easy to add as many items as we want with different colors.
  3. To find more values for the color_scope property, you can place the cursor in a code file of choice and press Ctrl + Alt + Shift + P. The current scope is then displayed in the status bar. However it’s probably easier to just use the color property instead and set the wanted color directly.

Happy highlighting!

/Emil

Devart LINQ Insight

I was recently approached by a representative from Devart who asked if I wanted to have a look at some of their products, so I decided to try out the LINQ Insight add-in for Visual Studio.

LINQ Insight has two main functions:

  • Profiler for LINQ expressions
  • Design-time LINQ query analyzer and editor

If you work much with LINQ queries you probably know that Visual Studio is somewhat lacking in functionality around LINQ queries out of the box, so what LINQ Insight offers should be pretty welcome for any database developer on the .Net platform (which is quite a few of us these days). Let’s discuss the two main features of LINQ Insight in some more detail.

Profiling LINQ queries

If you’re using Entity Framework (LINQ Insight apparently also supports NHibernate, RavenDB and a few others, but I have not tested any of those) and LINQ, it can be a little difficult to know exactly what database activity occurs while the application executes. After all, the main objective of OR mappers is to abstract away the details of the database and instead let the developer focus on the domain model. But when you’re debugging errors or analyzing performance it’s crucial to analyze the database activity as well, and that’s what LINQ Insight’s profiling function helps with.

There are other tools for this of course, such as IntelliTrace in Visual Studio Ultimate, but since it’s only included in Ultimate, not many developers have access to it. The LINQ Insight profiler is very easy to use and gives access to a lot of information.

To enable profiling, follow these steps:

  1. Make sure that the IIS Express process is not running; stop it if it is. (This assumes we’re using IIS Express, of course. I’m not quite sure how to work with the full IIS server in conjunction with LINQ Insight.)
  2. Open the profiling window by selecting View/Other Windows/LINQ Profiler, or pressing Ctrl+W, F.
  3. Press the “Start profiler session” button in the upper left corner of the window (it looks like a small “Play” icon).
  4. Start debugging your application, for example by pressing F5.
  5. Debugging information such as this should now start to fill the profiler window:
    The profiler displays all LINQ activity in the application.

    As you can see, in this case we have several different contexts that have executed LINQ queries. For example, ApplicationContext is used by ASP.Net Identity and HistoryContext is used by Code First Database Migrations. Context is our application context.

  6. We can now drill down into the queries and see what triggered them and what SQL statements were executed.
    Drilling down into profiler data.

    We can see the LINQ query that was executed, the SQL statements, duration, call stack, etc. Very useful stuff indeed.

Query debugger and editor

The other feature LINQ Insight brings into Visual Studio is help with writing and debugging LINQ queries. To debug a query, follow these steps:

  1. To open a query in the query editor, just right-click on it in the standard C# code editor window and select the “Run LINQ Query” option:

    To debug or edit a LINQ query, use the right-click menu.

  2. If the query contains one or more parameters, a popup will be shown where values for the parameters can be given.
  3. Next, the query will be executed, and the results will be displayed:
    Query results are displayed in the Results tab.

  4. This is of course useful in itself, and even better is that the generated SQL statements are displayed in the SQL tab and the original LINQ query in the LINQ tab, where it can be edited and re-executed, after which the SQL and Results tabs are updated. Really, really useful!

If an error is displayed in the Results tab, the most probable reason is that the database could not be found via the project’s config file, or that the connection string could not be interpreted correctly. The latter is the case when using the LocalDB provider with the "|DataDirectory|" placeholder, which can only be evaluated at runtime in an ASP.Net project. To make LINQ Insight find a database MDF file in App_Data in a web project, you can follow these steps:

  1. Make sure that your DbContext sub-class (for Entity Framework, that is) has an overloaded constructor that takes a single string parameter, namely the connection string to use:
    public Context(string connString) : base(connString) {}
    

    This is required if LINQ Insight cannot deduce the connection string from the project’s config file. This is usually a problem in my projects since I like to keep domain logic in a separate project (normally a class library), apart from my “host application”. (A sketch of what such a context class might look like follows after this list.)

  2. Double-click the MDF file in the App_Data folder to make sure it’s present in the Server Explorer panel in Visual Studio.
  3. Select the database in the Server Explorer, right-click it and select Properties. Copy its Connection String property.
  4. In the LINQ Interactive window, click the Edit Connection String button, which is only enabled if the DbContext class has a constructor overload with a connection string parameter (which we ensured in step 1).
  5. Paste the connection string into the Data/ConnectionString field in the panel:
    Use the connection string dialog to override the “guessed” connection string.

    Click OK to close the dialog.

  6. Re-run the query with the Run LINQ Query button in the LINQ Interactive window, and it should now work correctly. If it doesn’t, try the Run LINQ Query command in the C# code editor again, since it re-initializes the query.
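For reference, here is roughly what the context class from step 1 might look like. This is just a sketch of a hypothetical Entity Framework Code First context; the LogEntry entity and the "DefaultConnection" name are made up for the example:

using System.Data.Entity;

// Hypothetical Code First context with the two constructors:
// the default one used by the application itself and the string
// overload that lets LINQ Insight supply its own connection string.
public class Context : DbContext
{
    public Context() : base("name=DefaultConnection") { }

    public Context(string connString) : base(connString) { }

    public DbSet<LogEntry> LogEntries { get; set; }
}

public class LogEntry
{
    public int Id { get; set; }
    public string Message { get; set; }
}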

The ability to freely set the connection string should make it possible to work against any database, be it a local MDF file, a full SQL Server database or a Windows Azure database. This could be used as a simple way to try out new or modified LINQ queries against a staging or production database, right from the development environment. That could be very useful in some situations, for example when debugging nasty errors.

Summary

All in all, I think LINQ Insight is a very useful tool and I recommend trying it out if you find yourself writing LINQ queries from time to time.

If you have tried LINQ Insight before and found it slightly unstable, I should mention that Devart has recently fixed a few errors that make the tool much more robust and useful. If unsure, just download the trial version and test it out.

Happy Linqing!

Emil

AutoHotkey – the essential tool for all “automate-it-all” practitioners

AutoHotkey logo

Introduction

This post is about a tool I always feel very crippled without, namely the incredibly useful AutoHotkey. I always install this onto new machines and I rely on it heavily every day. The strange thing is that rather few people I meet know about it so I thought I’d remedy that ignorance somewhat by posting an overview here on my blog.

So what is it? AutoHotkey allows you to map keypresses to custom macros.

This key press mapping works in any application by default, and the macros you can write are very flexible.

Some examples

The most common use is probably to start applications for a given key combination. Here’s one I use:

!#n::Run Notepad++

This starts Notepad++ when I press Win-Alt-N (in AutoHotkey syntax, ! is Alt and # is the Windows key).

You can also write slightly more complex macro code (the macro language is custom to AutoHotkey; here’s some documentation):

^!Enter::
	FormatTime, CurrentDateTime,, yyyy-MM-dd
	EnvGet, UserName, UserName
	SendInput /%CurrentDateTime% %UserName%
	return

This inserts a timestamp with my username when I press Ctrl-Alt-Enter.
Example: /2013-11-02 emila

Note that this works in any application, since AutoHotkey sends standard Windows messages simulating keypresses for the text. So there’s no problem using this macro in Notepad, for example.

To take this one step further, the trigger does not have to be based on modifier keys such as Ctrl and Alt; it can also be a sequence of standard characters. Here’s one example:

; Today
::tdy::
	FormatTime, CurrentDateTime,, yyyy-MM-dd
	SendInput %CurrentDateTime%
	return

This creates a macro that inserts the current date every time I type the sequence “tdy” followed by a word delimiter (space, tab, dot, comma, etc). This simple macro is probably the one I use the most; it’s so incredibly useful to have easy access to the current date when taking notes, creating folders, etc.

I also have a few code snippets I use a lot when programming:

::isne::String.IsNullOrEmpty
::isnw::String.IsNullOrWhiteSpace
::sfm::String.Format 

This way I never feel tempted to write string comparisons such as if (s == null) { ... }; it’s just as easy to write if (String.IsNullOrEmpty(s)) { ... } using my snippet. And this kind of snippet works even in Notepad :-)

This ability to replace character sequences is also very useful for correcting my common spelling errors:

::coh::och
::elelr::eller
::perosn::person
::teh::the

I try to spot common operations I perform that could be automated, and then write an AutoHotkey macro for them. For example, I have noticed that I often need to write a valid Swedish social security number (personnummer) when testing applications I write. This can be a pain since the number has to end with a correct checksum digit, so I wrote a simple web service that creates a random SSN and returns it. The service can be called from AutoHotkey like this:

; Call a web service to generate a random personnummer (Swedish SSN)
::pnr::
	; Download the generated number to a temp file and read it back
	EnvGet tmpDir,TEMP
	UrlDownloadToFile http://kreverautils.azurewebsites.net/api/testdata/personnummer,%tmpDir%\random.pnr.txt
	FileRead, pnrRaw, %tmpDir%\random.pnr.txt
	; Strip any quote characters from the response and type the result
	StringReplace, pnrClean, pnrRaw, ", , All
	SendInput %pnrClean%
	return

This is really convenient: I just type “pnr” and it’s replaced with a valid SSN. It lowers the mental barrier when testing applications where this data is required, resulting in better testing. (Mental barriers when testing applications are an interesting subject in themselves, perhaps worth a separate blog post some time…)
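In case you are curious about the checksum itself, it is the standard Luhn (mod 10) digit computed over the first nine digits of the number. Here is a rough C# sketch of that calculation, just as an illustration (it is not the actual code of my web service):

// Rough illustration of the Luhn check digit used by Swedish personnummer.
static class Personnummer
{
    // nineDigits is YYMMDDNNN, e.g. "811218987" for 811218-9876
    public static int LuhnCheckDigit(string nineDigits)
    {
        int sum = 0;
        for (int i = 0; i < nineDigits.Length; i++)
        {
            int digit = nineDigits[i] - '0';

            // Every other digit, starting with the first, is doubled;
            // two-digit products have their digits summed (16 -> 7, etc).
            int product = (i % 2 == 0) ? digit * 2 : digit;
            sum += product > 9 ? product - 9 : product;
        }
        return (10 - sum % 10) % 10;
    }
}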

Summing it up

The above examples just scratch the surface of what you can do with AutoHotkey, so why not give it a try? It’s free, so it’s just a matter of downloading it and starting to experiment. Download it from its home page.

Final tip

I like to have all my computers share the same AutoHotkey setup, so I keep my main macro file (general.ahk) in a Dropbox folder. I also create a shortcut in my startup folder with this target string:

C:\Users\emila\Dropbox\Utils\AutoHotKey\general.ahk

Since AutoHotkey associates itself with the “.ahk” file extension, this is enough to start the script at startup. Any change I make to the macro file is automatically propagated to all my computers.

Good luck with your automations!

/Emil

Using NDepend 4 to analyze code usage

This week I was assigned the task of analyzing the usage of an ASMX web service that we are planning to remove, since it has a number of problems (which is another story), and replace with new and better-written WCF services. As the service is rather large and has been around for years, the first step was to find out whether it had methods that no client actually used anymore.

For this task I decided to use the brilliant code query functionality built into NDepend 4. I have briefly reviewed earlier versions of this tool on this blog but this time I thought an actual example of how to use it in a specific situation would be illuminating.

The first step was to retrieve a list of the methods in the web service. To do that, I added an NDepend project to the web service solution. See below for an example of the dialog used for this:

Attaching an NDepend project to a Visual Studio solution

NDepend then performed an analysis of my solution, after which I was able to start querying my code using the CQLinq query language. NDepend has long had its SQL-like CQL (Code Query Language), but for some reason I never got around to using it. NDepend 4 introduces CQLinq, which is much nicer syntactically and has a good editor for writing code queries, including IntelliSense. For more info about CQLinq, see this introduction.

What I needed was a list of methods on my Web Service class. To retrieve this, I opened the “Queries and Rules Edit” window (Alt-Q) and typed:

from m in Methods
where m.ParentType.FullName == 
   "ActiveSolution.Lernia.Butler.EducationSI.Education" && m.IsPublic
select m

The CQLinq query window.

The results are displayed in the bottom pane. I exported the list to an Excel file for further processing.

The next step was to see which of the web service methods the different clients used, so I analyzed each client with NDepend. Note that I excluded their test projects from the NDepend analysis to make sure that no lingering old integration tests affected the results.

For each client I listed those methods of their respective web service proxy classes that they were actually calling. A query for that can look like this:

from m in Methods  
where m.ParentType.FullName == "ActiveSolution.Lernia.SFI.WinClientFacade.Butler_EducationSI.Education"  
&& m.IsPublic  
&& !m.Name.StartsWith("Begin") 
&& !m.Name.StartsWith("End") 
&& !m.Name.Contains("Completed") 
&& !m.Name.Contains("Async") 
&& m.NbMethodsCallingMe > 0 
select m

The ParentType is of course the proxy class that gets generated when adding a web service reference. For this type, I list all public methods (except the asynchronous helper methods, which we don’t use anyway) that are called by at least one other method. The results were copied into the already mentioned Excel document, and when all clients’ data had been retrieved I was able to do some Excel magic to get this result:

The resulting Excel report listing the reference count for each method in the web service class.

Columns B through G contain a ‘1’ if the respective client calls the method. Rows with a sum of zero in column H are not used by any client, so those methods can be safely removed. Mission accomplished.
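As a side note, if you only care about a single client you can let CQLinq do this last filtering step directly instead of going through Excel. A sketch, using the same proxy type as above, could look like this and should list the proxy methods that nothing in that client calls:

from m in Methods
where m.ParentType.FullName == "ActiveSolution.Lernia.SFI.WinClientFacade.Butler_EducationSI.Education"
   && m.IsPublic
   && !m.Name.StartsWith("Begin")
   && !m.Name.StartsWith("End")
   && !m.Name.Contains("Completed")
   && !m.Name.Contains("Async")
   && m.NbMethodsCallingMe == 0
select m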

This has just scratched the surface of what can be done using CQLinq, and there is much more functionality in NDepend than just queries (the diagram tools, for example). It’s a great product for anyone who is seriously interested in code quality. And we all should be, right?

/Emil

Viewing assembly binding logs using fuslogvw.exe

Sometimes it’s very useful to analyze how an application binds to referenced assemblies, but this process is fairly hidden from us. However, Microsoft has given us a way to look into it via the fuslogvw tool. The tool is not overly documented, so this post describes how to install it on a computer that has neither Visual Studio nor the Windows SDK installed, such as a server.

Follow these steps:

Copy the following files from a computer with Visual Studio or the Windows SDK installed:

  • "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\FUSLOGVW.exe"
  • "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\NETFX 4.0 Tools\1033\flogvwrc.dll"

Put them into a folder on the target computer.

Create a logging folder, e.g. c:\fuslog.

Start FUSLOGVW.exe. Update the following settings:

  • Log all binds to disk
  • Enable Custom log path
  • Custom log path = c:\fuslog\

Click Refresh in FUSLOGVW.exe and behold! A list of assembly binding events is displayed:

The individual events contain useful details:

/Emil

2014-11-10 Update: Removed the mention of editing the registry. I added that a little too quickly; fuslogvw.exe is supposed to handle that for us…

NDepend 3

In case you missed it, there’s a very powerful tool called NDepend that helps you analyze your code structure and detect potential problems very easily. A new version, NDepend 3, was released earlier this year and the biggest new feature is probably Visual Studio integration:

It still has all the code metrics you’ll probably ever need, a custom-made code query language, diagrams (some of which are still a little difficult to read) and graphs to help with analysis. See my post about the previous version for some examples.

Here are a few live examples of using NDepend: http://www.ndepend.com/Features.aspx#Tour

If you’re into code analysis you should give NDepend a try. With this new version it’s more usable than before and it still has the same very powerful analysis engine under the hood. Check out http://www.ndepend.com for more info.

/Emil

NUnit with SQLite and .Net 4.0 Beta 2

SQLite and unit testing is a great combination for testing database operations without having to manage database files. You can simply create an in-memory database in your setup code and work with that instance. Perfect in combination with NHibernate, for example.
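To give an idea of what this looks like in practice, here is a minimal sketch of such a test fixture, assuming the System.Data.SQLite ADO.NET provider (the table and the test itself are just placeholders):

using System.Data.SQLite;
using NUnit.Framework;

[TestFixture]
public class InMemoryDatabaseTests
{
    private SQLiteConnection _connection;

    [SetUp]
    public void CreateDatabase()
    {
        // ":memory:" gives a fresh database that lives only as long as the connection
        _connection = new SQLiteConnection("Data Source=:memory:;Version=3;");
        _connection.Open();
    }

    [TearDown]
    public void DestroyDatabase()
    {
        _connection.Dispose();
    }

    [Test]
    public void CanCreateTable()
    {
        using (var command = new SQLiteCommand("CREATE TABLE Person (Id INTEGER PRIMARY KEY, Name TEXT)", _connection))
        {
            command.ExecuteNonQuery();
        }
    }
}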

If you want to do this in the current .Net 4.0 beta you’re out of luck, though; you’ll get an exception:

System.IO.FileLoadException: Mixed mode assembly is built against version 'v2.0.50727' of the runtime
and cannot be loaded in the 4.0 runtime without additional configuration information.

The solution is pointed out by Jomo Fisher. What you do is include this snippet in the application config file:

<startup useLegacyV2RuntimeActivationPolicy="true">
  <supportedRuntime version="v4.0"/>
</startup>

When unit testing assemblies that reference System.Data.SQLite.DLL, you have to put that snippet in NUnit’s config file (C:\Program Files\NUnit 2.5.2\bin\net-2.0\nunit.exe.config).

If you combine this with the tip in my post NUnit with Visual Studio 2010 Beta 2, you should insert the following

<startup useLegacyV2RuntimeActivationPolicy="true">
  <supportedRuntime version="v4.0"/>
  <requiredRuntime version="v4.0.20506" />
</startup>

plus

<loadFromRemoteSources enabled="true" />

under the runtime tag.
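Put together, the relevant parts of nunit.exe.config then end up looking something like this (only the elements discussed here are shown):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0"/>
    <requiredRuntime version="v4.0.20506"/>
  </startup>
  <runtime>
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>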

This works for me; hopefully it will for you as well.

/Emil

log4net configuration Xml Schema

I just found a very useful XSD for editing log4net configurations:

http://csharptest.net/?p=38

Just copy the schema file to C:\Program Files\Microsoft Visual Studio *\Xml\Schemas and add the schema reference to the log4net element in your config file:

<log4net xsi:noNamespaceSchemaLocation="http://csharptest.net/downloads/schema/log4net.xsd" 
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
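With the schema in place you get IntelliSense and validation while editing the configuration. As an illustration, a minimal (hypothetical) log4net config using a console appender could look like this:

<log4net xsi:noNamespaceSchemaLocation="http://csharptest.net/downloads/schema/log4net.xsd"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <appender name="Console" type="log4net.Appender.ConsoleAppender">
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date [%thread] %level %logger - %message%newline" />
    </layout>
  </appender>
  <root>
    <level value="INFO" />
    <appender-ref ref="Console" />
  </root>
</log4net>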

/Emil