Thursday, November 08, 2007

Synchronizing AD with TFS

Just today I had the extremely frustrating task of forcing a synchronization between AD and TFS. The background of the problem: a user had been added to an Active Directory group that was already associated with a TFS permission group, but the change wasn't being picked up by TFS.

After looking around on the net for a while, I came across this forum thread:
http://forums.microsoft.com/MSDN/showpost.aspx?postid=1403304&siteid=1

I invoked the web service that returns the last time the ACL synchronization occurred and was a bit surprised to see that it had actually run much longer than an hour ago. From what I read, if the setting is not explicitly declared in web.config, the synchronization interval defaults to one hour.

What I had to do was create a new TFS group, associate the Active Directory group with it, and then delete the TFS group.
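The workaround can be scripted with the TFSSecurity command-line tool. The sketch below is a minimal illustration only: the server name, project scope, group names, and the exact /gc, /g+ and /gd flag syntax are my assumptions here, so verify them against tfssecurity /? on your own server before relying on this.

```python
import subprocess

SERVER = "tfsserver"          # hypothetical TFS application-tier server name
SCOPE = "MyTeamProject"       # hypothetical team project scope

def sync_commands(ad_group: str, temp_group: str = "TempSyncGroup"):
    """Build the three TFSSecurity calls that force a sync:
    create a throwaway TFS group, add the AD group to it, delete it."""
    return [
        ["tfssecurity", "/server:" + SERVER, "/gc", SCOPE, temp_group],
        ["tfssecurity", "/server:" + SERVER, "/g+", temp_group, ad_group],
        ["tfssecurity", "/server:" + SERVER, "/gd", temp_group],
    ]

def force_sync(ad_group: str):
    """Run the commands in order; raises if any call fails."""
    for cmd in sync_commands(ad_group):
        subprocess.run(cmd, check=True)
```

Wrapping the three steps in one function at least makes the hack repeatable when support requirements force another manual sync.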

It's not an elegant solution, but there are many times when you must force the synchronization due to support requirements. I'm not impressed that no TFS web service exists to invoke the synchronization manually.

Sunday, November 04, 2007

TFS Custom FTP Task to Mainframe

Recently, I've had the opportunity to build a custom task to FTP code from TFS to a mainframe environment. It was quite an adventure but ultimately rewarding when it finally succeeded. I adapted code that someone had kindly written to wrap WinInet.dll and used it to FTP code to a mainframe running z/OS. I tested the custom FTP task against my FTP server in the Windows environment and it worked like a charm.



All that remained was adapting it to the mainframe, or so I thought. My first few attempts met with dismal failure. The reported error was: "The server has returned an extended error. 200 Representation type is Ascii NonPrint."



After doing some research, I discovered that what we think of as folders and files are instead known as datasets and members on the mainframe. Mainframe dataset names are dot-separated, as opposed to the slash-separated paths of the Windows environment. Thus, a dataset would look like this:

MCP.STAGE1.COBOL



The fully qualified dataset and member would then look like this:

MCP.STAGE1.COBOL(MyFile)



It took me a while to understand that the member name cannot include a file extension. Thus, I had to write a parsing routine to strip it out.
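The parsing comes down to dropping the extension and fitting the name into a member reference. Here's a small sketch of the idea in Python (the function name and the 8-character truncation rule for member names are my additions, based on z/OS member naming limits rather than anything in the original task):

```python
def to_mainframe_target(dataset: str, file_name: str) -> str:
    """Build a fully qualified dataset(member) reference from a
    Windows-style file name, e.g. 'MyFile.cbl' against
    'MCP.STAGE1.COBOL' yields 'MCP.STAGE1.COBOL(MYFILE)'."""
    # Strip the extension -- member names cannot carry one.
    member = file_name.rsplit(".", 1)[0] if "." in file_name else file_name
    # z/OS member names are upper case and at most 8 characters.
    member = member.upper()[:8]
    return f"{dataset}({member})"
```

In the real custom task this mapping ran just before each FTP PUT, so every Windows file name arrived as a valid member reference.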



Once that was done, everything worked until the transfer failed because the dataset ran out of space. Another quirk I had to learn about: a dataset is allocated a fixed amount of storage, and once that is exhausted it cannot expand further. On a side note, I also learned that datasets which go unused for some time are archived to tape. That one took a while to track down, but we finally understood after one of the mainframe experts explained why the first run always failed and the second always succeeded: the failed attempt triggers the recall from tape.
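Given that fail-once-then-succeed behaviour, a single retry was enough to paper over the tape recall. A minimal sketch of the retry wrapper (the function, its parameters, and the choice of OSError as the failure signal are my assumptions, not the original WinInet wrapper's API):

```python
import time

def upload_with_retry(upload, attempts=2, delay_seconds=0):
    """Call upload() and retry on failure.

    A first transfer to a dataset that had been archived to tape would
    fail while the recall was in progress; the second attempt would then
    succeed, so two attempts covers the common case.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return upload()
        except OSError as err:  # FTP failures surfaced as I/O errors here
            last_error = err
            if delay_seconds:
                time.sleep(delay_seconds)  # give the recall time to finish
    raise last_error
```

Keeping the retry in the build task, rather than asking developers to re-queue failed builds, made the quirk invisible to everyone else.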



Moral of the story: respect mainframes and be flexible in your mindset. Don't automatically treat everything the same way as Windows or the more recent flavours of Unix/Linux.

Tuesday, October 23, 2007

Team Foundation Server Role

Recently, I've been assigned to the Australian Tax Office as a Team Foundation Server administrator. It took quite a bit of adjustment and a whole lot of learning to settle into this new role. From someone who routinely broke the build when checking code in, I became responsible for the code of a few hundred developers.

Software Configuration Management was never a field I dreamt of entering, to be honest. I was always more focused on integration, collaboration and the user experience. As a result, I dedicated my time to learning how to use the Microsoft Smart Client Software Factory. I was very much inspired by my ex-career manager, Michael Daniels, whom I hold in high esteem.

When I was first posted to the ATO, I spent a lot of time learning as much as I could about TFS and MSBuild (which is the foundation of TFS build types). Later on, I experimented with creating my own custom build tasks and learning how to integrate them into the builds.

After that transition period, I was straightaway thrust into work and had to implement an FTP task to a mainframe. The code for FTP was thankfully written by one of the architects at the ATO and he saved me the long road of having to experiment with API calls to implement FTPing up to the mainframe. There are some quirks around how the mainframe's file structure is handled and I won't go into the details.

After getting the FTP task to work, I recently adapted the build types to build both from latest and from a label in a flexible and efficient fashion. My predecessor had implemented a few custom Get tasks that were so slow the builds would time out. The performance gain with the new build types was phenomenal: a build that used to take over an hour now takes just 5 minutes. I'm still testing whether this build type satisfies all the requirements, but I'm fairly optimistic.

I'm currently enjoying my role as a TFS admin. I wouldn't have thought I'd ever say this, but the learning experience and the responsibility satisfy me as a professional, and it's a role I'd like to continue in, at least for the near future.

After reading numerous blogs from experienced TFS experts such as Grant Holliday, Nagaraju Palla and from my colleague, Andrew Whitten, I discovered a community that is free with knowledge and helpful beyond expectation.

Here's to the future!