Open With... the editor of your choice.

I'm on a VS IDE high lately, so to continue in that arena, here is another tip my students and audiences seem to like.

I'm sure there is someone out there who has an XML file with a non-standard extension and wants to be able to use the XML editor in Visual Studio. When you open such a file there is no colour coding or any of the other XML editor features, because you are not in the XML editor. This is a nice little trick; I hope someone finds it useful.

I found this the first time I set up a Visual Studio project to hold all my NAnt scripts. NAnt build files are XML with a .build extension. When you edit these files in Visual Studio they are not colour coded by default, because Visual Studio does not know they are XML files.

Here is how to rectify the situation. When you open the file in Visual Studio (File->Open->File...), you will notice a drop-down arrow on the Open button.


Click the drop-down arrow and select Open With... You should get the Open With dialog.


Select the editor you want to open the file with. While you are here, click the Set as Default button. From now on, whenever you open a file with that extension it will open in the editor you selected. In my example, files with a .build extension now always open in the XML editor.


How do you feel about the VS.NET Query Designer?

The VS Data Team wants your input. Head over here. (BTW, don't you love these surveys? They're the best and tell me that MS really cares about what we think).

Layered Design and DataSets for Dummies

Scott Hanselman does a nice 30-second intro to layered design. If any of this is new to you, run quickly and read it.

Scott takes a quick bash at DataSets (although he doesn't say why), and in my new role as DataSet boy I have to disagree with him and evangelize how much simpler DataSets can make a lot of the code written by the typical programmer: CRUD stuff, for example. He even mentions “Adapter“ in describing a data access layer - come on, use the DataAdapter - don't be afraid. In general, if anybody tells you to never do something, you need to question that a bit and dig into the reasons why a technology exists. Of course the answer may indeed turn out to be never - but always try to get the why.
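
To put some meat on that: here is a minimal sketch of the kind of CRUD plumbing a DataSet/DataAdapter combination buys you. The connection string, table and column names are made up for illustration.

using System;
using System.Data;
using System.Data.SqlClient;

class DataAdapterCrudSketch
{
    static void Main()
    {
        // Placeholders for illustration - point this at your own database.
        string connStr = "server=(local);database=Northwind;integrated security=SSPI";
        SqlConnection conn = new SqlConnection(connStr);

        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName FROM Customers", conn);
        // The CommandBuilder derives the INSERT/UPDATE/DELETE commands
        // for simple single-table selects like this one.
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

        DataSet ds = new DataSet();
        adapter.Fill(ds, "Customers");                      // Read

        DataTable customers = ds.Tables["Customers"];
        customers.Rows[0]["CompanyName"] = "ObjectSharp";   // Update
        DataRow newRow = customers.NewRow();                // Create
        newRow["CustomerID"] = "OSHRP";
        newRow["CompanyName"] = "ObjectSharp Consulting";
        customers.Rows.Add(newRow);

        adapter.Update(ds, "Customers");                    // One call writes it all back
    }
}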

We've been running a developer contest at the end of some of our training courses (the big 3-week immersion ones). The competition has developers build a solution based on a service-oriented architecture which includes a smart client, web services, enterprise services/COM+ and, of course, data access. It's only a 1-day event, but the teams are made up of 5-6 people each. Inevitably, if one team decides to use DataSets/DataAdapters and the other team doesn't, the team that chose the DataSet wins. This competition isn't judged or skewed towards DataSets by any means, but inevitably this is the thing that gives that team more time to work on the interesting pieces of the application (like business logic: features and functions).

I overheard Harry Pierson tell a customer last week that they shouldn't use DataSets in a web service because they aren't compatible with non-.NET platforms. This isn't true. A DataSet is just XML when you return it from a web service, and you probably have more control over the format that is generated via the XSD than most people realize. If you want a child table nested, no problem. You want attributes instead of elements, no problem. You want some columns as elements and others as attributes, no problem. You want/don't want the embedded schema, no problem. You don't want the diffgram, no problem. Somebody in the J2EE world has actually gone to the extent of creating a similar type of base object in Java that can deserialize the DataSet in most of its glory. (Link to come - can't find it right now.)
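
If you haven't tried shaping that XML, here is a rough sketch of the knobs I mean (the table and column names are invented for the example; the same settings can come from your XSD instead of code):

using System;
using System.Data;

class DataSetXmlShapingSketch
{
    static void Main()
    {
        DataSet ds = new DataSet("Order");

        DataTable header = ds.Tables.Add("OrderHeader");
        header.Columns.Add("OrderId", typeof(int));

        DataTable item = ds.Tables.Add("OrderItem");
        item.Columns.Add("OrderId", typeof(int));
        item.Columns.Add("Sku", typeof(string));

        // Want the child table nested under its parent? Mark the relation as nested.
        DataRelation rel = ds.Relations.Add("HeaderItems",
            header.Columns["OrderId"], item.Columns["OrderId"]);
        rel.Nested = true;

        // Want a column written as an attribute instead of an element? ColumnMapping.
        header.Columns["OrderId"].ColumnMapping = MappingType.Attribute;
        item.Columns["OrderId"].ColumnMapping = MappingType.Attribute;

        header.Rows.Add(new object[] { 1 });
        item.Rows.Add(new object[] { 1, "ABC-123" });

        // Don't want the embedded schema (or do want the diffgram)? Pick the write mode.
        ds.WriteXml(Console.Out, XmlWriteMode.IgnoreSchema);
    }
}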

In February I posted “Benefits of Datasets vs. Custom Entities“, which has generated some excellent feedback. It's still in my plans to write the opposite article - when Custom Entities are better than DataSets - but I'm still looking for the best template or example entity. Every one somebody has sent me to date is somewhat lacking. To be fair, I always end up comparing them to a DataSet. The things typically missing from a custom entity are the ability to deal with null values and the ability to track original values for the purposes of optimistic concurrency. The answer to “When should you use a Custom Entity over a DataSet?“ is of course: when you don't need all the stuff provided for you by a DataSet. So is that typically when you don't care about null values or optimistic concurrency? Perhaps. But I know there is more to it than that.
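
For what it's worth, both of those are already in the box with a DataSet - a quick sketch (table and column names invented):

using System;
using System.Data;

class NullsAndOriginalValuesSketch
{
    static void Main()
    {
        DataTable t = new DataTable("Customer");
        t.Columns.Add("Name", typeof(string));
        t.Columns.Add("Region", typeof(string));

        DataRow row = t.Rows.Add(new object[] { "ObjectSharp", DBNull.Value });
        t.AcceptChanges();   // pretend this is the state that came back from the database

        // Null values: a real null check instead of magic sentinel values on an entity.
        bool hasRegion = !row.IsNull("Region");
        Console.WriteLine("Has region: " + hasRegion);

        // Original values: after an edit, both versions are still tracked - exactly
        // what you need to build an optimistic concurrency WHERE clause.
        row["Name"] = "ObjectSharp Consulting";
        Console.WriteLine("Original: " + row["Name", DataRowVersion.Original]);
        Console.WriteLine("Current:  " + row["Name", DataRowVersion.Current]);
    }
}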

I will say that the binary serialization of a DataSet is crummy (it's really XML under the covers). That's a real problem if you are doing custom serialization or .NET remoting. But you can change the way DataSets are serialized (and indeed it's changed in Whidbey). There are some good examples here, here, here, here, here and here.
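
One common workaround when the default format hurts (remoting in particular) is to ship the schema and diffgram yourself and rebuild the DataSet on the other side. Here is a minimal sketch of that idea only - the published surrogate samples go further and turn the rows into arrays before serializing:

using System;
using System.Data;
using System.IO;

// Minimal sketch: a serializable carrier for a DataSet's schema plus diffgram.
// Pass this across the remoting boundary instead of the DataSet itself.
[Serializable]
public class DataSetCarrier
{
    private string schemaXml;
    private string dataXml;

    public DataSetCarrier(DataSet ds)
    {
        schemaXml = ds.GetXmlSchema();

        StringWriter writer = new StringWriter();
        ds.WriteXml(writer, XmlWriteMode.DiffGram);   // diffgram keeps row states and original values
        dataXml = writer.ToString();
    }

    public DataSet ToDataSet()
    {
        DataSet ds = new DataSet();
        ds.ReadXmlSchema(new StringReader(schemaXml));
        ds.ReadXml(new StringReader(dataXml), XmlReadMode.DiffGram);
        return ds;
    }
}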

I'm working on an article making the case for the custom entity, but in the meantime, DataSets represent a good design pattern for entities that is quick and easy for the average developer to implement - and scalable too.

Incremental Search

Barry found this and showed me some time ago. I had forgotten about it and stumbled across it again today. I keep forgetting it's there; I'm hoping that by blogging it I will remember.

When you are editing code in Visual Studio, hit CTRL+I. You will see a little icon appear - this is Incremental Search.

Once in this mode, just start typing and Visual Studio will jump to the text that matches what you have typed so far, as you type.

If you have typed the whole word but search has not found the one you want, hit F3 to jump to the next occurrence.


TechEd (Day 3): Hands On Lab Manuals downloads available to the public

No need to have a TechEd CommNet password. You can download ALL the PDFs for the plethora of topics. Some good stuff to see how the newly announced things (Team System, etc.) work.

Update: These links are broken; give this a try: http://www.msteched.com/TechEdLabManuals.aspx

How does dynamic help work?

Did you ever wonder how Dynamic Help works? The Visual Studio IDE continuously emits keywords and attributes depending on what you are doing in the IDE. Let's start by turning on Dynamic Help's debug output. I don't mean the kind of debugging you do to your code - it's easier if I just show you. Follow these steps (don't worry, it's harmless); if you'd rather not edit the registry by hand, there's a small code sketch after the steps that sets the same value.

  1. Close Visual Studio .NET if it is not already closed.
  2. Open the Windows Registry Editor (RegEdit).
  3. Go to the following key:
    HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\7.0\Dynamic Help
  4. Add a new string value named "Display Debug Output in Retail" with a value of "YES".
  5. Start Visual Studio .NET.
  6. Open the Dynamic Help window (Help->Dynamic Help).

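If you would rather not poke around in RegEdit by hand, a few lines of code set the same value - a minimal sketch (it writes the exact key and value from the steps above; adjust the version number for your install):

using Microsoft.Win32;

class EnableDynamicHelpDebugOutput
{
    static void Main()
    {
        // Same key and value as the manual steps; the version segment depends on your VS install.
        RegistryKey key = Registry.CurrentUser.CreateSubKey(
            @"Software\Microsoft\VisualStudio\7.0\Dynamic Help");
        key.SetValue("Display Debug Output in Retail", "YES");
        key.Close();
    }
}
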
Now open a solution and start working. Watch the Dynamic Help window: you will see the keywords displayed there as you manipulate your code in Visual Studio.

For example, if I were to click on a Windows Form in a VB.NET project, I would see the following in my Dynamic Help window.

--- Keywords - priority: 500 ---
(F1 Kwd) Keyword=Designer_System.Windows.Forms.Design.FormDocumentDesigner
(Kwd) Keyword=System.Windows.Forms.Form
--- Keywords - priority: 100 ---
(Kwd) KEYWORD=VS.Ambient
--- Attributes ---
Devlang=VB
Item=Project
Item=vb
Product=VB
Product=VS
Project=Exe
ProjType=LocalProj
Selection=None
ShellMode=Design
Solution=Any
Solution=Multiple
SourceControl=FALSE
SubType=Form
UniqueFiles=resx
UniqueFiles=vb

What you are seeing are the keywords and attributes emitted by Visual Studio. This is how Dynamic Help works. You can add your own custom help that displays different help links based on the keywords emitted from the IDE.

Ok, so now that I know how to get the keywords, how do I add my own custom help? This is a blog, not an article, so I'll give you the abridged version; you can look up the rest yourself.

Create a file called CustomHelp.xml and place it in C:\Program Files\Microsoft Visual Studio .NET 2003\Common7\IDE\HTML\XMLLinks\1033

Here is what a sample CustomHelp.xml file could look like (there's a sketch of one after this list). Some nodes to take note of:

  • LinkGroup defines a heading in the Dynamic Help window to group the links under
  • Keywords/KItem define which keywords trigger this help to display
  • Links/LItem defines the URL for the help topic, the link group it belongs under, and the text to display
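
Here is a rough sketch of such a file. I'm reconstructing the element names and namespace from memory of the context files that ship in the same XMLLinks folder, and the URLs are made up, so compare against one of those files before trusting the details:

<?xml version="1.0" encoding="utf-8" ?>
<DynamicHelp xmlns="http://microsoft.com/vstudio/tbd/vsdh.xsd">
  <LINKGROUP ID="OSStandards" Title="ObjectSharp Coding Standards" Priority="500" />
  <Context>
    <Keywords>
      <KItem Name="VS.Ambient" />
    </Keywords>
    <Links>
      <LItem URL="http://www.objectsharp.com/standards/naming.htm" LinkGroup="OSStandards">Naming Conventions</LItem>
      <LItem URL="http://www.objectsharp.com/standards/exceptions.htm" LinkGroup="OSStandards">Exception Handling</LItem>
    </Links>
  </Context>
</DynamicHelp>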

A CustomHelp.xml file like the one above will add a new group to the Dynamic Help window - in my case the last group, called ObjectSharp Coding Standards.

The keyword I used in the sample above (VS.Ambient) is a special keyword: it is always present. Therefore, if you want a help link to always be accessible, use the VS.Ambient keyword in your CustomHelp.xml.

As I'm sure you can tell I have only scratched the surface on this subject. There is plenty more information in the MSDN library.

The Immediate Window

I notice a lot of developers do not use the Immediate window. If you are one of them, do yourself a favour and check it out - it's terribly useful. For those of you who are not familiar with it, here are some highlights to get you started.

The window I'm talking about is actually called the Command Window in VS 2003. It can be used either to issue commands or to debug and evaluate expressions in Visual Studio .NET. To open it, press CTRL+ALT+A, or use the menu: View->Other Windows->Command Window.

For the record, I believe I heard these two modes will be separated into their own windows in Whidbey.

Below is a quick description of the two modes.

  • Command Mode (title bar displays "Command Window"): allows the developer to perform any command available in Visual Studio via a command-line interface.
  • Immediate Mode (title bar displays "Command Window - Immediate"): used when debugging an application to evaluate expressions, print variable values, or execute statements. It can also be used to change the value of a variable during a debug session, or even to execute a function.

How do you switch between the two modes?

  • When you are in Command mode, enter the command immed (the > you see is just the prompt) to switch to Immediate mode.
  • When you are in Immediate mode, switch back to Command mode by typing >cmd.
  • If you want to execute a single command while in Immediate mode, prefix it with a >, e.g. >alias.

Immediate Window:

This is the mode I use most. It's like a command-line QuickWatch, but with IntelliSense. To evaluate an expression you must prefix it with a ?, for example:

  • ?this.Order.OrderTable(0).itemarray
  • ?this.Order.OrderTable(0).CustomerId
  • ?this.Order.OrderTable.count

In previous versions of the Immediate window, the UP and DOWN ARROW keys simply moved the cursor. In VS .NET they let you scroll through previously issued commands; if you scroll to a command and hit Enter, it is copied down as the next command to be executed.
Once in a project, the Command Window can be opened in Immediate mode by pressing CTRL+ALT+I, or via the menu: Debug->Windows->Immediate.

One more quick tip: you can copy an expression from the Immediate window into the Watch window. In other words, get it right with the help of IntelliSense, then paste it into the Watch window so you can keep an eye on it permanently.

The Command Window:

Just to give fair time to both modes: the Command Window makes full use of IntelliSense to help you find and execute any command within VS.

You can also create your own alias for any command. For example, you can type the following to save all the files in your solution:

>File.SaveAll

However, if you used this command a lot, you could create a shorter alias for it, like fs.
To create the alias, enter the following into the Command Window:

>alias fs File.SaveAll

From now on you can just type fs to perform a File.SaveAll.

To see a complete list of aliases, just type alias in the Command Window and it will list them all.


TechEd (Day 1): Ballmer's Keynote & Announcing VS 2005 Team System

It's official. I'll post more thoughts and analysis about this as time permits, but here are the things you should know:

  • Microsoft now has a new Team version of Visual Studio, to be delivered “next year“ according to Ballmer.
  • New source control - more details to follow.
  • Project management - devs will be able to see “Work Items“ in their IDE. There is also supposed to be a SharePoint portal of some kind where devs & PMs can see a dashboard view of a project, milestones, etc., integrated with MS Project Server.
  • Unit testing - yes, a very NUnit-ish thing built right into Visual Studio.
  • Code coverage - yes, in the editor you can see what code was executed and what was not.
  • Static code analysis - a la FxCop, integrated right inside of Visual Studio.
  • Check-in source control process policy, so a manager type can say “if you check in something, all tests must pass, all static analysis rules must pass, and your code coverage must be 100%“.
  • Also shown was some load testing stuff that is going to be better than Application Center Test - more on that later.

Of course, the Whitehorse class modeling & SOA designers were shown quickly. Nothing new to announce on that front that wasn't covered at PDC... although the guy doing the demo kept saying “Services Oriented APPLICATION” designer. Is this new? Is he changing the acronym from Architecture?

TechEd (Day 0): BOF36: Integrating Unit Testing Tools and Practices Into the Software Development LifeCycle

This BOF went pretty well, and a huge thanks to Jim Newkirk for assisting in the delivery. He's a real authority on the practices around NUnit and a good guy to have around. If you buy his new book on Test-Driven Development in Microsoft .NET on site at TechEd, you can probably catch him at the MS Pavilion to sign a copy.

Some interesting points discussed:

  • Using unit tests to drive “example code“ for a framework or class library would be a nice-to-have.
  • While code coverage statistics may satisfy external parties that we've tested what we've developed, percentages are not an accurate measure of code quality.
  • If you write your tests after you do your coding, you already have too much information about your classes, and that negatively affects how you test.
  • Testing first can really influence (positively) the design of your classes.
  • Developers will work aggressively against source-code check-in policies that stipulate that a percentage of the code has been covered by unit tests, that the tests pass, and that the code passes static analysis.
  • It's difficult to test user interface code, and for a bunch of reasons it's not a really good idea or a worthwhile investment: the only way to see your application from the outside in is through the eyes of a specific user, and you'll never be able to approach that.
  • At the end we also got into some of the difficulties of testing against a database, and a bit about mock objects. That would probably be a good BOF on its own.

Jim might have more comments, but the general feeling I got was that people still need more education about automating unit tests and that not a lot of people are doing it today, let alone Test First. Jim also mentioned that he didn't think it was possible to lecture to somebody and convince them about Test First, but more that it was something that they just really needed to see for themselves. I agree.

TechEd (Day -1): Pieces of ADO.NET 2.0 not going to make Whidbey

Highlights from this exchange of newsgroup posts:

  • Paging is cut
  • Server-side cursors are cut (the SqlResultSet class)
  • Async Open is cut, but async Execute is not
  • The SqlDataTable class is cut (good - that was a bad design pattern to propagate)
  • Command sets are cut, but you can still do batch DataAdapter updates (rough sketch below the link)

http://communities.microsoft.com/newsgroups/default.asp?icp=whidbey&slcid=us
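
For the curious, here's roughly what the surviving batch update support looks like - this is a sketch based on the UpdateBatchSize property in the Whidbey bits, so names and details could still change before release:

using System.Data;
using System.Data.SqlClient;

class BatchUpdateSketch
{
    static void Main()
    {
        SqlConnection conn = new SqlConnection(
            "server=(local);database=Northwind;integrated security=SSPI");
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName FROM Customers", conn);

        SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
        adapter.UpdateCommand = builder.GetUpdateCommand();
        // Batched commands can't fetch values back per row, so turn that off.
        adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

        DataSet ds = new DataSet();
        adapter.Fill(ds, "Customers");

        foreach (DataRow row in ds.Tables["Customers"].Rows)
        {
            row["CompanyName"] = row["CompanyName"].ToString().Trim();
        }

        // 0 = send everything in one batch; any N > 1 sends N rows per round trip.
        adapter.UpdateBatchSize = 0;
        adapter.Update(ds, "Customers");
    }
}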

More to come, stay tuned.