Yes, I said "admins," but, unfortunately, this isn't an announcement. I'm simply thinking of a version of Visual Studio built for, well, admins. Specifically, I'm envisioning an environment to build and debug PowerShell scripts and cmdlets. Of course, this is only a hop, skip, and a jump away from PowerShell as an official .NET language. Imagine that, PoSh.NET... or, would someone force it to be P#? Either way, I like the idea. As an official language, that would also open it up to compiled scripts, which would be great for those servers without PowerShell installed. Of course, it's just a matter of time before PowerShell is the default and the legacy DOS shell is eventually phased out. As a matter of fact, that's the plan for Windows 7 and Server 2008 R2. I'm not sure about Server Core, tho, since there's still the dependency on .NET.
In what must be a huge coincidence, the MSDN subscriptions team just informed us about an updated feed for MSDN downloads (link is to the en-US feed). While I'd like to think this was due to my prodding, I think it's pretty safe to say it's not. Besides, this is just a revamped version of the old feed, which only covers the subscriber downloads. The team's post makes it sound like a very good thing. I'm not sure if anyone else followed the old feed, but the quality bar wasn't very high. I consistently noticed that new downloads weren't announced. This should fix that. Unfortunately, it won't change the fact that you'll get a multitude of announcements for products released in different formats (e.g. ISO, EXE, DVD). At least you can get it in your desired language, tho.
Windows Live FolderShare used to be the most important tool in my arsenal. That changed when Live Mesh went into beta. Since I was added to the pre-beta program, FolderShare has been but a mere memory. That decision may have been a questionable one, however. There has been talk in the past about FolderShare being rebranded as Live Sync, and the FolderShare team just confirmed it. The bigger picture, however, is that what we now see as "Live Mesh" is only a sample application on top of the Windows Azure platform. The key here is "sample application." I always knew the sync and remote desktop capabilities within Mesh were intended to be a proof of what's possible, but it was never positioned as an app that would eventually be dropped. Granted, I'm an early adopter and am used to a few cuts and bruises, but it would have been nice to see where this was going. Maybe that doesn't matter, tho, because I don't think I'd change anything. Live Mesh is significantly better than FolderShare, and I imagine Live Sync won't be much better in its initial incarnation. Soon, however, we'll start seeing Live Sync take over more and more of what Live Mesh has to offer. How long will it take? Only time will tell. I'd like to say 6 months, but the FolderShare team is notoriously slow. The only thing we really have to cling to is that the FolderShare... err, Sync team should be able to take a lot of what Mesh has today. The impression I got was that this will be an evolutionary improvement, tho. We'll see what happens next month. Either way, be ready to switch from Mesh to Sync. I'm hoping the team manages to automate that upgrade/migration so it will be relatively transparent. The only problem I see today is the format FolderShare uses during the sync process; however, it sounds like Sync will change all that. Like I said, tho, we'll see...
Do you have an MSDN subscription and want to know what was updated? Are you supposed to receive DVDs, but think you are missing a few? Unfortunately, I find myself answering "yes" to both of these questions. I was ready to complain about it, but then someone pointed me to the MSDN Subscription Index. Using this site, you can search for products, view shipment contents, or just see what's new or removed in a shipment. The site has a fairly crappy experience, but at least it gives you the information. I'm also somewhat annoyed that there isn't an RSS feed. Since I find myself so curious, I'm going to try to keep up with what gets released and share it in my blog feed -- if you're only interested in these updates, there's also a feed just for the MSDN subscription updates. I may be late with the updates because it's a manual process, but I'll also pursue having the MSDN team produce their own feed. I should mention that there is a feed for the latest downloads. I'm going to focus strictly on the DVDs that are released. If I have time, I'll go back a few months and create separate posts for each of the releases. Hopefully, that will make the feed fairly sensible. Having one big catch-up post seems a bit much. These posts will be back-dated, so if you see this post after others, that's why.
Ya know, I'd be remiss if I didn't say I'm partially expecting someone to email me or comment with a link to an existing RSS feed. There is an MSDN Subscriptions blog, but that doesn't seem to tell me what I want to know. If someone knows of a good source, I'd love to hear it.
If you're using Windows and aren't using ZoomIt, you're missing something. We all run into those scenarios where we see an image, but it's not quite as big as we'd like. For those situations, Microsoft gives us ZoomIt. There's not a whole lot to explain here. Download it, run it, and press Ctrl+1 to zoom in. From there, move your mouse around to reposition the view and press the up and down arrow keys to zoom further in and out. There are also options for drawing, typing, and even a timer. These are for presentations and I don't use them much, but I'm sure others find them more useful. If you haven't tried it before, ZoomIt is definitely worth a try. You can even run it from the web to stay up-to-date with the latest release.
I'm a month and a half late, but the ReSharper nightly builds are back! I guess I stopped checking after not seeing any movement for a while. I'm glad to see some activity, tho. This is the most beneficial add-in to Visual Studio I've seen, especially as a productivity geek. What I've been most surprised about is the overall quality of the nightly builds in the past 6 months. Simply outstanding. If you're asking yourself whether to give it a shot or not, I say go for it. You're likely to run into minor issues, but if my experiences are indicative of how well they manage their day-to-day development, this is a team that runs a very tight ship. I always grab the latest and try to update a few times a week, depending on what I'm in the middle of. If you're not quite as confident as I am, grab a "works here" release. I'm sure you'll see how great this tool is in a relatively short time. An absolute must-have for all code-focused developers.
I know some people have asked in the past how to map latitude/longitude on Live Maps and I just ran into a scenario where I needed to do it. Well, it's as simple as typing in the desired lat/long, selecting Locations, and doing a search. Admittedly, it's not the most intuitive thing in the world, but I figured it out in a matter of seconds.
If you're looking to link directly to a lat/long position, you can also pass in the lat/long via querystring parameters. There are a number of different querystring parameters, and I'm not going to go thru them all here, but I will mention the two most important: cp and where1. The cp parameter is the lat/long position you want to center on. Using this alone will center on the lat/long at the default level, which isn't much help. For this reason, I'd recommend you also specify the where1 parameter, which performs a search for the lat/long. The key thing to remember here is that cp delimits the latitude and longitude by a tilde (~), whereas where1 uses a comma. Here's a sample URL: http://maps.live.com?cp=36.16773~-115.157181&where1=36.16783%2C-115.15707.
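If you're generating these links from a script, the two parameters are easy to build up. Here's a quick PowerShell sketch -- Get-LiveMapsUrl is just a name I made up for illustration, not anything Microsoft ships:

```powershell
# Hypothetical helper that builds a Live Maps URL from a lat/long pair.
# Note the delimiters: cp uses a tilde, where1 uses a (URL-encoded) comma.
function Get-LiveMapsUrl([double]$lat, [double]$long) {
    $cp = "$lat~$long"
    $where = [System.Uri]::EscapeDataString("$lat,$long")
    return "http://maps.live.com?cp=$cp&where1=$where"
}

# Emits a link like the sample URL above
Get-LiveMapsUrl 36.16773 -115.157181
```

The only subtlety is the comma in where1, which is why the lat/long string gets run thru EscapeDataString before being appended.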
Over the years, I've been asked to put together coding standards again and again. The nice thing about this is that it enables me to pull out the old docs and touch them up a little. A year or two ago, I heard something that made a lot of sense: developers never really read coding standards and, even if they do, they don't usually adopt them. Let's face it, if you don't adopt a standard as your own, you're not going to use it. The only way to ensure the standard is applied is to catch the problem before it gets checked in. I tried a VS add-in that attempted to do this as you type, but it wasn't quite as extensive as I wanted. Still, I grabbed onto the concept. For the past year, I've been wanting to start this and have finally decided to do it.
As I sat down and started to investigate writing custom code analysis rules, I asked myself how I was going to validate them. After hacking away at one approach after another, I started to realize I wasn't going to get very far. Apparently, with the latest releases of Visual Studio and FxCop, there's no way to create the objects used to represent code. After talking to the product team, the official position seems to be that, since custom rules aren't "officially supported," they're not going to make them testable. I'm not sure who made this decision, but I think it's a bad one. Of course, I say this without knowing their plans. Well, not completely, anyway.
It's not all bad news, however. It turns out there are hopes to start officially supporting custom code analysis rules in the next major release, Visual Studio 10. Nothing's being promised at this point; it's just something the team would like to deliver. I should also say that the upcoming Rosario release isn't the major release I'm referring to. I'm expecting Rosario to be a 9.1 release that will probably hit the streets in early 2009. That's a guess, tho. If that's true, the VS 10 release probably wouldn't be until 2011. All I can really say about it is that it'll be a very exciting release. I can't wait to get my hands on a beta. Speaking of which, some of the goals they have for the product will make beta testing much, much easier... I'm talking about a hugely evolutionary change, if not revolutionary, considering where the product is today. That's all I can really say, tho.
Back to the point: since there's no real testability of the code analysis framework, I decided to create my own object model. The part I'm missing, obviously, is the factory logic that converts code analysis types to my types. I'm hesitant about this approach, but it's working so far. Hopefully, I'll have something to deliver soon. I keep bouncing around, tho, so at this point, I want to deliver a release with only naming conventions. That release is mostly complete; I just need to get approval for a distribution mechanism. If I don't get that soon, I'll just release it on my site.
It's amazing what sticks and what doesn't. Back in Aug 2004, I caught wind of UpdateVersion, a tool Matt Griffith wrote to update version numbers in AssemblyInfo files. The tool is pretty simplistic, but provides an absolute benefit. Every couple of months, I get asked for a copy of the changes I made... despite the fact that they've been available online for years. Nonetheless, it's about time I created a project on CodePlex for the utility. At this point, I don't really expect to make any changes to it, but I will if someone sees value in it. If I were to make any changes, I'd probably go ahead and convert it to .NET 3.5 and possibly even add a PowerShell cmdlet.
I just wanted to share a small script that creates a new profile and registers the ps1 file extension. For those who don't know, ps1 is the default extension for PowerShell scripts. Of course, there's a difference between ps1 and bat or cmd. If you double-click a ps1 file in Windows Explorer, it'll open in Notepad. What's up with that!? Apparently, this was done for security reasons. The idea is that, since you can't simply double-click the file to execute it, hopefully you'll actually look at it to make sure it's not going to kill your system. PowerShell is much more dangerous than traditional batch files are, so this is probably a good thing. With that in mind, PowerShell, by default, doesn't even allow you to execute these ps1 files. To do that, you have to set the execution policy. Anyway, here's the script...
# Cast the profile path to a DirectoryInfo so we can get at its parent folder
$dir = [System.IO.DirectoryInfo]$profile
# Create the directory the profile script lives in
New-Item -Type Directory -Path $dir.Parent.FullName
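And since I mentioned the execution policy, here's the one-liner to loosen it so local ps1 scripts can actually run. RemoteSigned still requires downloaded scripts to be signed, which is a reasonable middle ground:

```powershell
# Allow locally authored scripts to run; anything downloaded must be signed.
# This changes a machine-wide setting, so run it from an elevated prompt.
Set-ExecutionPolicy RemoteSigned
```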
I found this online a while ago, but I don't remember where. The only other thing I want to mention, since I imagine the New-Item cmdlet might throw some people off, is that there's a function that simplifies this call and gives us a familiar DOS experience: md. I always thought md was an alias, but never bothered to consider why/how the New-Item cmdlet was determining that you wanted a directory.
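If you're curious, you can see for yourself what md resolves to on your machine -- tho I won't swear the chain looks identical in every PowerShell version:

```powershell
# Show whether md is an alias or a function, and what it ultimately calls
Get-Command md | Format-List CommandType, Name, Definition
```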