Concatenating strings in a batch file

Batch file? Yes, you heard me, a batch file. You know, .cmd.

This is an odd post for me; however, I needed to do this recently and don't want to forget how I did it. :)

 

When automating builds I often take advantage of the InvokeProcess activity. It's a nice way to keep my build process generic and call out to a process that may change over time or is specific to the branch I am building. It allows me to keep some of the process specific to the source code, by storing the batch file with the source code. This also lets me version the batch file if it changes over time, and ensures it is available to the build server without installing anything on the build server. None of this has anything to do with my blog post. Whatever your reason for calling out to a batch file using InvokeProcess, read on.

Have you ever had trouble passing variables like DropLocation or SourcesDirectory? If they have spaces in them, you will need to put them in quotes. Let's say I want to pass the DropLocation to a batch file.

The Arguments property of my InvokeProcess activity would look like this: """" + DropLocation + """"

Notice the doubled quotes around the variable; they ensure the value is passed to the batch file as one argument:

"\\My File Server\Drops\BuildDefinition_1.3.4636.1"

Without the quotes, my batch file in this example would have received three arguments:

  1. \\My
  2. File
  3. Server\Drops\BuildDefinition_1.3.4636.1

Therefore I need the quotes.
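If you want to see the splitting for yourself, a tiny throwaway batch file (hypothetical name ShowArgs.cmd, not part of the original post) that echoes its first three arguments makes it obvious:

@echo off
rem Prints the first three arguments this batch file receives.
echo First:  %1
echo Second: %2
echo Third:  %3

Call it once with the path in quotes and once without, and you will see the unquoted path arrive in pieces.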

In my batch file I want to access files in a subfolder of my DropLocation, so I need to concatenate the path passed in with another folder. Ultimately I want this location:

\\My File Server\Drops\BuildDefinition_1.3.4636.1\MyFolder

The problem is that, with the double quote at the end of the argument, %1\MyFolder would end up as "\\My File Server\Drops\BuildDefinition_1.3.4636.1"\MyFolder, with a quote stuck in the middle of the path.

Here is how to fix that problem.

First, create a variable to hold your argument:

set DROPLOCATION=%1

Then, when you use the variable, apply a little string manipulation. The syntax is %VARIABLE:~start,length%, where start is the number of characters to skip from the beginning and length is the number of characters to keep; a negative length keeps everything except that many characters at the end.

Therefore, to get rid of the double quote at the end of our argument, we just drop the last character.

Like this: %DROPLOCATION:~0,-1%\MyFolder"

The above syntax results in "\\My File Server\Drops\BuildDefinition_1.3.4636.1\MyFolder", making it possible to add my folder in the batch file instead of in the build process.
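Putting it all together, here is a minimal sketch of what such a batch file might look like. The TARGET variable name and the final dir are just illustrative placeholders for whatever you actually do with the path; MyFolder is the subfolder from the example above.

@echo off
rem %1 arrives quoted, e.g. "\\My File Server\Drops\BuildDefinition_1.3.4636.1"
set DROPLOCATION=%1
rem Drop the trailing quote, append the subfolder, then close the quote again.
set TARGET=%DROPLOCATION:~0,-1%\MyFolder"
rem The path is fully quoted again, so the spaces are safe.
dir %TARGET%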

What's New with SQL Server 2012?

Like any newly released version of software, Microsoft SQL Server 2012 brings a host of significant changes to Microsoft's enterprise data platform. And, in this case, the changes are not only in the features but in the editions that are being offered as well. There is also a new certification model, along with many performance, business intelligence and development enhancements. So take a look at the list of features and see which ones matter to you.

Data Quality Services – Clean data is happy data. And Data Quality Services (DQS) is a knowledge-based tool that helps ensure your databases contain high-quality, correct data. DQS performs data cleansing, which can modify or remove incorrect data. It also does data matching to identify duplicate data and profiling that intelligently analyzes data from different sources. Keeping your data happy has never been easier.

Changes to T-SQL – A number of T-SQL enhancements have been introduced in SQL Server 2012. My personal favorite is TRY_CONVERT for data conversions. But that's a personal preference. Others might be partial to OFFSET and FETCH (for data paging), the new FORMAT() function (data formatting made easier), and the new THROW statement (exception handling, .NET style).

SQL Server Data Tools – SQL Server Data Tools uses the Visual Studio 2010 shell and enables model-driven database development, on top of more traditional T-SQL and SQLCLR development and debugging. From a connectivity perspective, SQL Server Data Tools can connect to SQL Server 2005 and later, as well as to SQL Azure. That last feature makes it particularly useful, as I have found myself putting more and more databases into the cloud.

Windows Server Core Support – Windows Server Core is designed to support server-based, infrastructure applications. We're talking about applications such as SQL Server that provide back-end services but don't need a GUI on the same server. Previously, SQL Server could not be run on Windows Server Core, which always seemed ironic (and not in the Alanis Morissette way). SQL Server 2012's support for Windows Server Core allows for leaner and more efficient SQL Server installations, reduces the potential attack surface, and minimizes the need to apply non-relevant updates and patches.

These are not all of the features, but they are the ones that spring to mind for me. SQL Server 2012 also includes new SKUs (Enterprise, Business Intelligence, Standard and Express), contained databases, a columnstore index used to support high-performance data warehousing scenarios, Power View and AlwaysOn Availability Groups.

Want to learn more? View the new SQL Server 2012 courses (and SQL Server 2008 courses) that are being offered at ObjectSharp right now!


Revealing Module Pattern in CoffeeScript

My friend Colin introduced me to CoffeeScript last week through Mindscape's Web Workbench. As I started to learn CoffeeScript, the first thing I wanted to do was hand-translate some of my JavaScript files to see how the CoffeeScript compiler would convert them back to JavaScript, so I would end up with functionally the same code in the end.

I use the Revealing Module Pattern quite a bit. I found an implementation written by Burk Holland, and in the comments of Burk's post Steve Howell and Elijah Manor provided additional implementations. Although Burk's example came close to what I was looking for in simplicity and cleanliness, I felt that there could be some improvements:

  • It defined the module in the global namespace. I always create my modules in a namespace I define for my project to further protect against naming collisions.
  • I didn't like how the module ended with the definition of a variable to contain the public functions.

I put forward the following approach:

I use the existential operator to reuse my project's namespace if it is already defined, or to create a new one, although the || operator does produce nicer JavaScript output. I'm sitting on the fence as to which way to go.

I create an object literal as the last expression in the function, so it is what gets returned. This skips the creation of a temporary variable that is never referenced again.

I’m pretty happy with the JavaScript that is produced.  It very closely matches how I implement the Revealing Module pattern in JavaScript, and it upholds the simplicity and cleanliness of CoffeeScript.

Serious Test Case Design

 

I am doing part of a talk at the Microsoft Canada Office on Thursday September 13th at 6:00pm.

Here is the blurb:

Testing with Microsoft Test Manager

Coded UI tests provide a way to create fully automated tests to validate the functionality and behaviour of your application's user interface. We'll cover some of the ways you can use these methods of testing with TFS 2012 to increase your team's productivity. Accelerate your testing with Fast Forward for Manual Testing to quickly re-run test steps or even entire test cases in the future.

Come out, it should be fun.

Cloud with a Chance of…Doesn’t Matter

In the past week, I have seen a couple of articles that discuss the general public's lack of awareness of the Cloud. The following, from the Globe and Mail, summarizes it quite nicely.

“While cloud computing is growing increasingly pervasive, a new survey shows how many people are still cloudy in their thinking about the technology.” - http://www.theglobeandmail.com/report-on-business/small-business/sb-tools/small-business-briefing/cloudy-thinking-about-cloud-computing/article4504986/

The survey includes tidbits like: 54% of people don't think they use cloud computing (when in fact only 5% don't), only 16% identify it correctly and (this one is my favorite) 51% believe that stormy weather can interfere with cloud computing.

(As an aside, I just got back from Punta Cana, where the Internet (and thus cloud computing) was turned off for two days while Tropical Storm Isaac passed through. Pretty certain that’s stormy weather interfering. :))

My comment about this state of affairs is: Who Cares?

What percentage of people have a working knowledge of the internal combustion engine? And yet a majority of people are quite able to drive without this knowledge. How many people have even the most basic understanding of how electricity is generated? And yet they don’t have a problem turning on a light.

Those of us in technology seem to think that it’s important to have others understand what we do. Perhaps it’s a need to appear smart. Perhaps we’re looking for acceptance after spending high school being given wedgies and swirlies. Doesn’t matter. I no more expect the average user of the technology I create to know how it works than I do my mother. And you shouldn’t either.

It should be completely transparent to the user where we put their information. The applications that we create should seamlessly transition between local storage, on-premises storage and the 'cloud'. The user should only be aware of this when they use their phone to access the Word document they were writing before they left the office. Actually, I'm wrong. They shouldn't care even then.

And that's how you should be building your applications: seamless integration between the various storage options. This isn't necessarily the easiest choice for developers. Seamless == more work. But tools like Windows Azure Mobile Services can help. Don't let the user know, though…they don't care. They shouldn't. All of their data should just be there. Like electricity.

SharePoint 2013 Prerequisites Install Error...

While installing the SharePoint 2013 prerequisites on Windows Server 2012 and SQL Server 2012, I received the error "There was an error installing the prerequisites..." After checking out the logs (under %TEMP%\prerequisiteinstaller.<date>.<time>.log), you quickly learn that the prerequisite install failed because of the Microsoft SQL Server 2008 R2 SP1 Native Client. To bypass this problem, you can manually download the Microsoft SQL Server 2008 R2 SP1 Native Client from http://download.microsoft.com/download/9/1/3/9138773A-505D-43E2-AC08-9A77E1E0490B/1033/x64/sqlncli.msi and install it. After you manually install the Native Client, go ahead and restart the SharePoint 2013 prerequisite installer. Now the SharePoint 2013 prerequisites should install successfully. :)
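For reference, the manual workaround from an elevated Command Prompt looks roughly like this. The download folder and the drive letter of the SharePoint installation media are only examples; adjust them for your environment.

rem Install the Native Client package downloaded from the link above.
msiexec /i C:\Downloads\sqlncli.msi /passive
rem Then re-run the SharePoint 2013 prerequisite installer from the installation media.
D:\prerequisiteinstaller.exe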

Change Windows 8 Product Key After Install

After I installed Windows 8 RTM, I tried to activate it as the good folks at Microsoft tell you to. When I clicked the Activate button, Windows activation failed, which of course made sense because I had not entered a product key yet. But, for some reason, there was no place to enter a product key under System properties. Or, at least, I did not see it. Luckily, the good old Command Prompt and the slmgr.vbs tool came to the rescue. Just follow these steps to add or change the product key using the Command Prompt and slmgr.vbs (a combined sketch follows the list):

  • Launch Command Prompt as an Administrator.
  • At the command prompt, type in "slmgr.vbs -ipk <insert your product key here>" and press Enter.
  • To activate Windows, type in "slmgr.vbs -ato" and press Enter.
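Put together, the whole session looks like this (the key shown is just a placeholder, use your own):

rem Install (or replace) the product key.
slmgr.vbs -ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
rem Activate Windows online.
slmgr.vbs -ato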

That's all. :)

The Legofication Of Business

I love Lego.

To be fair, the number of people who don't fall into that category is probably fairly small. There is nothing like the joy of taking the slightly differently shaped blocks and creating something bigger and better. And I'm not a big fan of all of the custom kits either. If a piece only has one purpose (like the nose of an X-wing fighter), then it's not for me.

I also love Star Trek. Well, not love, but greatly appreciate and enjoy its various forms over the years. And I have referenced Star Trek in various presentations, not to establish my geek cred (of which I have very little), but because of how software works 'in the future'.

And yes, Lego and Star Trek are related in this way.

The key to Lego blocks is the simple and consistent interface. It doesn't matter what shape a block is; the fact that every block has the same interface allows them to be connected. And it is through the various connections that a much bigger whole can be created.

Star Trek takes the Lego idea and applies it to software. Ever wonder how Geordi and Data were able to create new and complex software so quickly? Because all of the different components had the same interface. Or at least similar enough interfaces that the components could communicate with one another. And connected software allows you to create a much bigger whole.

Now let’s move back to the here and now. What’s missing from our current software environment that would prevent Geordi from creating the application that saves the Enterprise? Two things, really. The lack of a standard set of interfaces and the inability of most software functionality to be ‘connected’. And this is the next phase in software development.

If you're a company that provides services to others, then you need to think about enabling access to your service from the cloud. Want to allow people to easily buy your services or products? Give them an interface on the Web that allows them to do so. Not a Web page, but an API. Have some information that others might find useful? Give them an interface to access it. Create the Lego blocks that I was talking about earlier. Find standard interfaces for the type of data or service you offer and expose them on top of your services. In other words, provide Lego blocks for others.

One of the benefits of doing so is that you let others build out functionality based on your services. If your service or data is compelling enough, others will build your front end for you. You have already seen this happen with a number of the social sites that are out there. People combine information from Foursquare, Twitter, Facebook, Google+, etc. to create interesting apps for others to use. The engagement level of people with apps that run on their phones is high and likely to move higher. Finding ways to integrate your service or data with that ecosystem can only be beneficial.

So what's the downside? Well, you have to design and implement the interface. Technical, yes, but not beyond the scope of what most companies can do. And you need to provide the infrastructure for surfacing your API. This is where virtualization comes into play. I'm a fan of Azure and the new functionality it offers, but speaking generically, virtualize where it makes the most sense. If you're a Microsoft shop, I believe you'll find the biggest bang for your efforts with Azure.

But the technology is not the key here…it's the concept. Look at the products you offer to your clients. Find ways to expose those products to the Internet. Be creative. The payoff for your efforts has the potential to be significant. But more importantly, you take the first step towards what will be the development and integration paradigm for the next decade…Lego.

Microsoft Changes The World – Part 4

Yesterday was another Microsoft announcement day. Only this time, it was the preview for the next version of Office (you'll hear it called Office 15 or Office 2013). I was half surprised they didn't include this in the Worldwide Partner Conference last week; it was certainly suggested that yesterday would be about the next version of Office. But then again, they probably didn't want it to appear to be focused just on partners rather than on everyone.

Highlights

Integration with Metro – This is the version of Office that has the Metro sensibility. I suspect it will be the example used for what Line of Business applications in Metro should be until there are other, better instances out there.

Integration with SkyDrive – One of the default locations to store a document is on SkyDrive. That is, in the cloud. The integration is nice. But one of the interesting features was the ability to pick up right where you left off. In other words, edit a document on your desktop at work and save it to the cloud. On your way home, open up the document on your laptop and it moves to the exact place in the document you were at on the desktop. Finally, the same thing happens when you open the document on your Windows Phone device (although I’m not imagining much editing going on through that form factor).

PDF integration – PDF files can be opened in Word, edited and then saved as either a Word or a PDF document. Enough said to know that’s sweet.

Flash Fill in Excel – This one is tough to explain quickly. It's basically a tool that helps you parse data from a separate cell. Picture the following: you have a column of data that is tilde-delimited ('1~15.3~kyle~hockey' is the first cell, '3~17.8~curtis~soccer' is the second, and so on). In a separate column, but on the same row as the first cell, you type 'kyle'. In the cell immediately below that you start to type 'curtis'. Excel now makes the prediction that you are trying to extract the third value out of that column and fills in the rest of the values down the column. It's an edge case, but if you need this functionality (and I have in the past), this is way cool.

Other Features – A rotator control to select font families, sizes and colors. Integration with Bing so that search results can be embedded into Word directly (and the embedded HTML is live). A slightly improved UI that is still a ribbon, but the spacing between icons is greater to allow for easier interactions with a touch interface. A new presenter view for PowerPoint, including built-in zooming.

If you want to try the preview, you can get it from here: http://www.microsoft.com/office/preview/en. One word of warning. The C++ runtime used by Word is not compatible with the one used by Visual Studio 2012. As a result, you might get a warning indicating this when you launch VS2012 after installing the preview. There is a patch to this problem, available here:  http://www.microsoft.com/en-us/download/details.aspx?id=3017.  You’ll see this problem called out in the KB (http://support.microsoft.com/kb/2703187).

Microsoft Completed Build Deletion – where have our test results gone?

Did you know that the files and data that are part of a completed build can be deleted?

Did you know that the data deleted cannot be recovered?

Did you know that, by default, the Test Results from any automated tests run against the build are deleted?

If you have the right permissions, you can right-click a completed build in Build Explorer and select Delete. When you do that, by default all of the items associated with that completed build are deleted:

  • Details: Information about the completed build that is displayed in Build Explorer. This information includes build steps, requestor, and date and time queued.
  • Drop: Files and folders output by the build and copied to the drop location.
  • Test Results: Results of any automated tests executed during the build process, or results of any tests published against this build.
  • Label: The version control marker associated with the specific file versions used by the build process.
  • Symbols: The debugging symbols published to a symbol server during the build.

You can also configure the retention policy and set auto-deletion rules. Nothing wrong with that; however, could the person responsible for the build setup and maintenance be deleting Test Results without anyone realizing it?

There is an option that can be set to stop the deletion of your Test Results. 

Make sure your team understands what happens when completed builds are deleted. Set that option to keep your test results, unless you don't want them!

 

Testa :)