SqlConnection woes on the Compact Framework over WiFi

Last night at the Toronto .NET Users Group talk I did on mobility, a gentleman had a question about directly connecting to an enterprise SQL Server database from a Pocket PC using the Compact Framework. His users work on a shop floor and sometimes lose their WiFi signal. While he doesn't keep the SqlConnection open the whole time, it seems that once the WiFi signal comes back, his application can no longer connect to the database using SqlConnection.Open - despite the fact that he can still ping the server.

He was correctly doing a SqlConnection.Close in a finally clause around his data access, but I suspect that even though pooling is not supported on the Compact Framework, this does not do a proper disconnect of the physical connection (a SQL Profiler session would probably tell you that for sure). A using block in C# will ensure that the SqlConnection is disposed of immediately and will properly terminate the connection, so if you lose and reacquire your WiFi connection between calls, you'll be starting a brand new connection the next time. It's still a good idea to put all of your actual work in a try/catch/finally block with a SqlConnection.Close in the finally.

using (SqlConnection cn = new SqlConnection(CONNECTION_STRING))
{
    try
    {
        cn.Open();
        // do some work with the connection
    }
    catch (SqlException ex)
    {
        // error handling
    }
    finally
    {
        cn.Close();
    }
}

Cookieless Sessions and Security

In a previous blog, I pointed out that Microsoft had created an HttpModule that mitigated the ASP.NET canonicalization issue that was first described a couple of weeks ago. In one of the comments, Amir asked about the security issues surrounding the use of cookieless sessions.  Specifically, he was wondering if ASP.NET could tell if a request containing a cookieless session component was coming from a different browser instance or even a different location.  In brief, the answer is “No“.

When used as part of a plain text web site (i.e. no SSL), cookieless sessions are not secure at all. The session id is placed into the URL in every local link on a page.  The form of the URLs would be as follows:  http://domain.com/(sessionid)/Page.aspx. This session id is not automatically tied to a specific browser instance or even the IP address of the initial request.  There is no way that I'm aware of to tie the request to a browser (other than using cookies ;). And while a session can be associated with an IP address, this requires some additional work on your part, in the form of an HTTP module.  The association between session id and IP address could be done by generating a hash of the IP address for the request (Request.ServerVariables["REMOTE_ADDR"]) and the session id, and using the resulting value to access the session variables.  However, this solution doesn't take into consideration the case where people are behind a proxy that causes the IP address to change from request to request. For this, you might want to use just the first two components of the IP address, since these are not normally varied by the proxy. But if the spoofer is on the same subnet...  As you can see, all in all, this is a difficult problem to solve.
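To make the idea concrete, here is a rough sketch of such an HTTP module. It uses a simpler variant of the scheme described above - remembering the first two components of the originating IP address in the session and rejecting requests that don't match - rather than hashing the address into the session key. The module name, the session key, and the rejection behaviour are all illustrative, not a production recipe:

```csharp
// Hypothetical sketch: tie a session to the first two octets of the
// client IP address. All names here are illustrative.
using System;
using System.Web;
using System.Web.SessionState;

public class SessionIpCheckModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // Session state is available once it has been acquired for the request.
        app.PostAcquireRequestState += new EventHandler(OnPostAcquireRequestState);
    }

    private void OnPostAcquireRequestState(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        HttpSessionState session = app.Context.Session;
        if (session == null)
            return;

        // Use only the first two components of the IP address, so proxies
        // that rotate the low octets between requests don't break the check.
        string addr = app.Context.Request.ServerVariables["REMOTE_ADDR"];
        string[] parts = addr.Split('.');
        string subnet = parts[0] + "." + parts[1];

        if (session["__ipSubnet"] == null)
            session["__ipSubnet"] = subnet;  // first request: remember origin
        else if ((string)session["__ipSubnet"] != subnet)
            throw new HttpException(403, "Session does not match originating address.");
    }

    public void Dispose() { }
}
```

The module would be registered in the httpModules section of web.config. As noted above, a spoofer on the same subnet defeats this check, so it raises the bar rather than closing the hole.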

I should probably mention that cookie-based sessions suffer from the same problem.  There is nothing inherent in how cookied sessions work that makes them safer than their cookieless counterparts.  A hacker can easily create a request that includes a spoofed cookie containing a hijacked session identifier.  But at least a cookie-based session solution can use SSL to encrypt the entire request.  Since the session id is not included in the URL directly, it is not easily accessible if the request/response gets hijacked.

So to summarize, while cookieless sessions are nice in theory, in practice they really should only be used on sites where every page is accessed through SSL.  And the implementation should be customized to ensure that a portion of the IP address is used to verify that the originating IP address of subsequent requests corresponds to that of the original request.

Get lost, have fun.

Ok this is too cool. Bell Mobility has a location service called MyFinder. Using cellular triangulation, it can figure out pretty closely where you are. It costs 25 cents to locate yourself on any Bell Mobility cell phone with a browser (plus airtime). You can do things like find the nearest gas station to where you are right now (or ATM, restaurant, etc.) and then, using MapPoint, it will give you driving/walking directions. I tried out this service in Waterloo and it found me within a block of where I actually was. Bell says within 150m. In my case that was pretty close. So your mileage might vary (no pun intended).

If you are lucky enough to have a Sanyo 8100 or an Audiovox 8450/8455, you can also use the MapMe service, which will download a map to your device with your location plotted on it - along with nearby places. From the screenshot, this appears to be something along the lines of a Pocket Streets derivative. It costs $5 (one time) for the software, and again 25 cents per use to be located, plus airtime (unless you have the Mobile Browser Bundle).

Bell Mobility also has a few multiplayer games that you play against people nearby. Less cool. Get a life.

Perhaps the most (potentially) practical service is a #TAXI phone number that connects you with the first available taxi based on your location. It uses many cab companies. It sounds like they might find you a taxi parked around the corner, but I don't think it quite works that way. It found my city and routed me to the phone number of the nearest taxi company. It also sent me a text message with their direct phone number. At $1.25 a call, though? The taxi company should pay for that - no?

www.bell.ca/finderservices

You can develop applications using this service too. If you are lucky enough to live in Canada and be a Bell Mobility subscriber (as I am), then you are in luck, since Bell is now providing a realtime MS MapPoint Location Service. It almost makes up for being on a CDMA network and having that limited phone selection problem.

Microsoft MapPoint Location Server is a new component of Microsoft MapPoint Web Service. You install this component (along with SQL Server) in your enterprise, and using the MapPoint Web Service and location data from your provider, you can get an inventory of all the people in your company and their locations. For dispatch-type applications, 150 meters is probably close enough. General information on both is here (http://www.microsoft.com/mappoint).

MSDN Universal, Enterprise, and Professional subscribers are eligible for a free subscription to MapPoint Web Service (a requirement of using MLS)(https://s.microsoft.com/mappoint/msdn/msdnspec.aspx)

Bell Mobility Developers Web site for location based services (http://developer.bellmobility.ca/lbs/)

CTTDNUG UG Tomorrow Night: Building Pocket PC Applications with the Compact Framework and SQL CE

As part of the continuing MSDN User Group Tour, I'll be speaking at the Canadian Technology Triangle .NET User Group in Waterloo on Thursday October 7th (tomorrow night). There is a new location for the meeting at Agfa (formerly Mitra) in Waterloo. All the details are here.

Correction: Adam Gallant is going to be the speaker at this event tomorrow night. Sorry for the confusion. It should be a good talk.

Things are busy....

at ObjectSharp these days - more than usual. I attribute this to a recent upsurge in both .NET and BizTalk adoption - people doing real projects, running into real problems and needing real help from the experienced. We've been steadily hiring new consultants all year - but only one or two at a time, and it's time to put the call out again. Are you a seasoned developer/architect? Do you have real-world experience with .NET, BizTalk, or SharePoint? Do you consider yourself an excellent communicator? Interested in teaching as well? Then we (or more importantly, our customers) need your help. You live in Toronto or Vancouver, right?

Send your resume to careers@objectsharp.com.

IIS 6.0 Isolation Mode and ASP.NET worker process identity

A client ran into an intriguing problem the other day.  The application under development has a number of web services that get deployed onto one server or another as different versions are released for client testing.  Underneath the services, LDAP is used to store roles, preferences and other flotsam and jetsam as needed.  In order to gain access to this information, the IIS_WPG group is given read access to LDAP. Relatively straightforward.

The problem arose when the current version was deployed onto a new machine.  Instead of being able to connect to LDAP, an exception was being thrown.  The ASPNET worker process didn't have access to LDAP.  We were able to connect to the LDAP server, but attempts to negotiate access were being denied.

Four heads are now being scratched.  We removed and re-added the IIS_WPG user to LDAP.  We did a couple of IISResets to make sure the security context wasn't being cached.  We checked the SID that is included as the DN for the IIS_WPG group to make sure something hadn't been installed incorrectly.  Nothing.  We then gave Everyone access to LDAP.  The service started working again.  So we knew it had something to do with permissions.

In a fit of, well, desperation, we gave permission for the ASPNET user to access LDAP.  Wouldn't you know it.  Things worked again.  But this was unexpected.  One of the things that changed with IIS 6.0 was that ASPNET was no longer the identity under which the worker process runs.  Wasn't it?

As it turns out, the real answer is "it depends".  If you install IIS 6.0 freshly, then IIS_WPG is the group to which the permissions you'd like ASP.NET to have should be assigned.  That is to say, that IIS 6.0 runs in Worker Process Isolation Mode. However, if (and this is the 'if' that caught us) you upgrade from IIS 5.0 to 6.0, the ASPNET user is still the security context for the ASP.NET process.  This can be modified by changing the isolation mode in IIS.  Quite easy, if you know how.  The trick, as we found out, was knowing that we even had to.
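For anyone who hits the same thing: the switch can be made from the command line by flipping the IIs5IsolationModeEnabled metabase property (this assumes the IIS admin scripts are in their default location; adjust the path if yours differ):

```shell
REM Check whether IIS 6.0 is running in IIS 5.0 isolation mode
cscript %SystemDrive%\inetpub\adminscripts\adsutil.vbs get W3SVC/IIs5IsolationModeEnabled

REM Switch to Worker Process Isolation Mode, then restart IIS
cscript %SystemDrive%\inetpub\adminscripts\adsutil.vbs set W3SVC/IIs5IsolationModeEnabled FALSE
iisreset
```

The same setting is available in the IIS Manager UI on the Service tab of the Web Sites properties.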

Working with Default Rows

Occasionally, a situation arises where the contents of a particular data table within a Dataset *might* originate from more than one source.  For example, a client of ours has a reporting system where there is a set of default columns for each report type.  When a user creates a new report (stored in the Report table in the Dataset), the default columns (the ReportColumns table) are loaded from a DEFAULT_COLUMN table.  If any change is made to the information in ReportColumns (like changing a caption, for example), all of the column information is copied and saved in a REPORT_COLUMNS table.  However, if no change is made, no information needs to be saved.  This allows changes in the default columns to be automatically propagated to any report that hasn't been 'customized'.

All of this functionality is easy to add to the Load method of an entity based on this DataSet.  However, what happens when a Report that uses the default columns is deleted?  Normal processing would have the related ReportColumn rows deleted, followed by the corresponding Report row.  However, when the delete is performed on each of the ReportColumn rows, a DBConcurrencyException is thrown.  The message with this exception is as follows:

Concurrency violation: The DeleteCommand affected 0 records.

With a little bit of thought (actually, slightly more than a little the first time this error was encountered :), the reason for the error became apparent.  The records in the ReportColumns table didn't exist in the database.  Nor should they, since they were originally populated from DEFAULT_COLUMNS, not REPORT_COLUMNS.  However, the DeleteCommand was trying to remove them from REPORT_COLUMNS.  And when the DataAdapter class attempts to delete a record and doesn't see any rows being changed, it assumes that a concurrency violation has taken place. 

The solution was to take a little more control of the deletion process.

dataAdapter.ContinueUpdateOnError = true;
dataAdapter.Update(reportsData);

if (reportsData.ReportColumn.HasErrors)
{
    DataRow[] drs = reportsData.ReportColumn.GetErrors();
    foreach (DataRow dr in drs)
        if (dr.RowError.Substring(0, 21) == "Concurrency violation")
            reportsData.ReportColumn.RemoveReportColumnRow((ReportsData.ReportColumnRow)dr);
    reportsData.ReportColumn.AcceptChanges();

    // If the data table still has errors, then an exception needs to be thrown
    if (reportsData.ReportColumn.HasErrors)
        throw new DataException("An exception was raised while updating the ReportColumn data table: " +
            reportsData.ReportColumn.GetErrors()[0].RowError);
}

The ContinueUpdateOnError property is used to stop the aborting of the updates on an exception.  Once this is done, we check to see if the data table has any errors.  If so, we make sure that every row that has a concurrency violation is removed from the DataSet.  The AcceptChanges method call is required to update the HasErrors flag, in case all of the errors have been eliminated in this manner.

Now is this a perfect solution?  No.  It is possible that a legitimate concurrency violation is ignored.  From the point of view of this application, it is considered an acceptable risk given the type of data being used.  If you wanted to trap the 'real' concurrency problems, you could select against the REPORT_COLUMNS table using the primary key (instead of all of the original values) when a concurrency violation is detected. If a record is found, then the concurrency problem is real.  Otherwise the exception can be ignored.
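As a sketch of that re-check (the table and column names follow this example, and the method, key parameters and CONNECTION_STRING constant are hypothetical), you could requery REPORT_COLUMNS by primary key before deciding whether a violation is real:

```csharp
// Hypothetical helper: decide whether a concurrency violation on a
// ReportColumn row is 'real' by checking if the row exists at all.
// Table/column names follow the example above; CONNECTION_STRING is assumed.
using System.Data.SqlClient;

public class ConcurrencyCheck
{
    private const string CONNECTION_STRING = "...";  // your connection string

    public static bool IsRealConcurrencyViolation(int reportId, int columnId)
    {
        using (SqlConnection cn = new SqlConnection(CONNECTION_STRING))
        {
            cn.Open();
            SqlCommand cmd = new SqlCommand(
                "SELECT COUNT(*) FROM REPORT_COLUMNS " +
                "WHERE REPORT_ID = @reportId AND COLUMN_ID = @columnId", cn);
            cmd.Parameters.Add("@reportId", reportId);
            cmd.Parameters.Add("@columnId", columnId);

            // A row found by primary key means some original value changed
            // underneath us: a genuine concurrency problem. No row means the
            // columns only ever existed as defaults, so it can be ignored.
            int count = (int)cmd.ExecuteScalar();
            return count > 0;
        }
    }
}
```

You would call this inside the foreach loop above, removing the row only when the method returns false and rethrowing otherwise.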

Oshawa .NET: Building Mobile Applications

I'm doing a talk at the East of GTA .NET users group tonight in Oshawa. This is the same MSDN User Group tour event sweeping across Canada. I'll be talking about some of the limitations of the Compact Framework and SqlCE. Should be fun - hope to see you there.

Registration Links and slides (afterwards) can be found here.

VS.NET IDE Teaching an old Dog new Tricks

I love hanging out with new VS.NET developers. It's enlightening to hear the troubles they face and their newfound energy to solve them. I have to blog more about these - but in general, there are often things that I do out of habit in the IDE, or things that I live with because I'm too lazy (or tired or busy) to find a way around them.

Two new tricks were brought to my attention by an associate of mine.

“How do I find all of the references to a class or usages of a member?” or “When I right click on a class and select Go To Reference, it goes to the first one it finds. How do I go to the next one?”.

The best answer is CTRL+SHIFT+1, which will jump you to the next reference. CTRL+SHIFT+2 will take you to the previous reference.

I couldn't find this shortcut anywhere in the menus. The complete list of shortcuts can be found here.

Which begged another question: can I put shortcuts or favorites in the IDE? Indeed, under View>Other Windows there is a Favorites window which shows your machine's favorites. This is great to have docked right next to your Dynamic Help (if you have it turned on).

A similar question was “How do I find all of the descendants of a class or implementations of an interface?”. I always used the online help for that, which doesn't help for your own code. One of the solutions I found (and maybe there is a better one) is to use Find Symbol under the Find and Replace menu (ALT-F12).

GDI+ Security Vulnerability

There is a new critical security vulnerability that affects a wide range of software that can't be easily patched through Windows Update. The vulnerability lies inside of GDI+ and can allow a maliciously formed JPEG image file to create a buffer overrun and inject malicious code - even through a web page's graphics...no scripting or anything.

Windows Update will go ahead and update major components, but you also need to go to the Office Update site, as well as update a bunch of other software you might have on your machine.

In particular for developers, the .NET Framework (pre-latest service pack) and even Visual Studio.NET 2003 and 2002 are affected and need to be separately patched.

The full bulletin with links for all the various patches is available here: http://www.microsoft.com/technet/security/bulletin/MS04-028.mspx

If you go to Windows Update it will also provide you with a GDI+ detection tool that will scan your hard drive looking for affected components. I strongly recommend everybody jump all over this one quickly.