
So You Were Forced to Use the Dreaded TFS Collection /Recover Command, Now What?

by Angela 11. October 2012 08:23

Since we have used Recover on a production database and lived to tell the tale, I thought I would share our experiences. If you read this post, you will know that one of my clients got themselves into a world of hurt where we needed to restore a nightly backup that was not detached. I know, I know, detached backups are the way to go. Well, now THEY know that too ;) Nonetheless, sometimes you may find yourself needing to recover a TFS Team Project Collection (TPC) database, and if you’ve read the MSDN documentation you’ll know this is not an ideal situation. The Recover command is very lossy, BUT you get your data back. And in our case it was worth the risk.

So here is the backstory… Someone deleted a Test Plan with a month’s worth of data in it, and if you know MTM you know there is no “undelete”. Restoring a backup was our only hope. BUT our nightly backups are SQL backups of the entire SQL Server instance, so they are undetached (we are addressing this NOW). Plucking one TPC out of there and attaching it is just not an option, and we did not have the hardware to restore the entire thing and detach it properly. So here is what we did:

  1. Restored the backed-up TPC from the nightly backup into our dev TFS environment
  2. Ran the TFSConfig /Recover command to get the TPC into the proper state
  3. Ran TFSConfig /Attach to get the TPC attached in dev (see the rough command sketch below)
  4. Detached the hosed TPC from production
  5. Restored that detached version of the TPC to production
  6. Attached the backup to production (we actually hit an interesting bug in TFS 2010 at this point, so the attach was quite harrowing and involved an emergency hotfix to our TFS sprocs, which I may blog about later)
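
In case it helps to picture steps 1 through 3 at the command line, here is a rough sketch. It is not a transcript of what we ran: the server, database, and collection names are placeholders, the RESTORE statement is simplified (you may need WITH MOVE clauses for your file paths), and you should confirm the exact TFSConfig switches against the MSDN documentation for your TFS version before trying this yourself.

    REM 1. Restore the nightly SQL backup of the collection database onto the dev SQL instance (names are placeholders)
    sqlcmd -S DevSqlServer -Q "RESTORE DATABASE [Tfs_MyCollection] FROM DISK = N'E:\Backups\Tfs_MyCollection.bak' WITH RECOVERY"

    REM 2. Run Recover against the restored collection database to get it back into an attachable state
    TFSConfig Recover /ConfigurationDB:DevSqlServer;Tfs_Configuration /CollectionDB:DevSqlServer;Tfs_MyCollection

    REM 3. Attach the recovered collection to the dev TFS instance
    TFSConfig Collection /attach /collectionDB:"DevSqlServer;Tfs_MyCollection"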

Now, I would love to say everything was perfect, but the Recover command did blow away some things that we had to get back into place before people could use the TPC again. What we lost:

  1. All the security settings, ever!
    • Collection level groups and permissions
    • Team Project (TP) level groups and permissions in every TP in the TPC
    • Permissions around Areas and Iterations in every TP in the TPC
    • Permissions around Source Control in every TP in the TPC
  2. SharePoint settings (in every TP in the TPC). Settings on the SharePoint server itself will be fine, of course, but you will probably see a “TF262600: This SharePoint site was created using a site definition…” error when you try to open the portal site that was once attached to those TPs. You will need to fix this in two places.
    • Go to the TFS Admin Console, select the TPC you just restored, and make sure the SharePoint Site settings for the TPC are correct. They will probably be set to “not configured” now.
    • Open Team Explorer (as an admin user), and for each TP go to “Team Project Settings | Portal Settings” and verify everything there is correct. Ours were just plain gone, so we had to enable the team project portal and reconfigure the URL.
  3. SSRS settings – these will probably be fine if you restored the database as-is, but we also renamed it as part of the restore, and so had to update the Default Folder Location through the Admin Console for the TPC in order for reporting to work again.

So, word to the wise: make sure you understand what the settings above are for all of the TPs in your TPC BEFORE you perform a Recover command, because chances are you will have to manually set them all back up.
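
One thing that would have made that easier for us (a hypothetical sketch, not something we actually had in place): dump the membership of your key groups to a file before you run Recover, so you have something to rebuild from afterward. The TFSSecurity switches and identity format below are from memory, so run TFSSecurity /? and check the documentation for your TFS version before relying on this.

    REM Hypothetical example: capture expanded membership of a couple of key collection-level groups before recovering.
    REM The collection URL and group names are placeholders.
    TFSSecurity /imx "Project Collection Administrators" /collection:http://tfsserver:8080/tfs/MyCollection >> pre-recover-groups.txt
    TFSSecurity /imx "Project Collection Valid Users" /collection:http://tfsserver:8080/tfs/MyCollection >> pre-recover-groups.txt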

Tags:

ALM | Application Lifecycle Management | MSDN | MTM | Microsoft Test Manager | Microsoft Test Professional | TFS | TFS 2010 | Team Foundation Server | VS 2010 | Visual Studio | TFS Administration


So You Accidentally Deleted Your MTM Test Plan, Now What?

by Angela 10. October 2012 04:14

So this week, we had a little bit of fun, by which I mean a day that started with panic and scrambling when someone accidentally deleted a Test Plan (yes, a whole test plan) in MTM in production. A well-established test plan with dozens of test suites and over a hundred test cases with a month’s worth of result data, no less... Some important things of note:

  • Test plans are not work items; they are just a “shell” and so are a bit easier to delete than they should be (in my opinion).
  • There is no super-secret, command-line-only undelete like there is for some artifacts in TFS, so recreating it from scratch or recovering the TPC are your only options to get it back.
  • When you delete a test plan, you lose every test suite you had created. Thankfully, the test cases themselves are safe in this situation. Worst case, a plan can be recreated, although it is tedious and can be time consuming.
  • When you delete a test plan, the test results associated with that test plan will be deleted. Let that sink in – ALL OF THE TEST RESULTS FOR THAT TEST PLAN, EVER, WILL ALSO BE DELETED.  ::this is why there were flailing arms and sweaty brows when it happened::

So at this point, you may be thinking it’s time to update your resume and change your phone number, but hold up. You may have some options to recover that data, so buy some donuts for your TFS admin (I like cinnamon sugar, BTW). I should mention, there may be a lot of other options, but these are the three I was weighing, and due to some things beyond my control we had to go with #2.

1) Best Case Scenario: Restore your DETACHED (this is required) team project collection database from a backup, because you’re totally taking nightly backups and using the TFS Power Tool, right? You lose a little data depending on how old that backup is, but it may be more important to get your test runs back than to redo a few hours of work.

2) Second Best Case Scenario: If you cannot lose other data, and are willing to sacrifice some test run data, then restore the TFS instance from a traditional SQL backup to a separate TFS instance (so, NOT your production instance), open up your test plan in that secondary environment, and recreate your test plan in production.  Not ideal, but if you didn’t have a ton of test runs this may be faster and you don’t sacrifice anything in SCM or WIT that was changed since the backup was taken.

3) Worst Case Scenario: If your backups were not detached when you did your last backup, cry a little, then use the Recover command. The gist is to use the TFSConfig Recover command on the collection database to make it “attachable” again, then attach it to your TFS instance. I have written a separate post on this because it can be complicated…

Once you are back up and running, make sure the rights to manage test plans are locked down! It might not be obvious that you can even do this, or where to find it, since it is an “Areas and Iterations” level permission. But do it, do it now! This permission controls the ability to create and delete Test Plans, so be aware of that. For the most part, only the people with the authority and knowledge to delete entire Test Plans, considering what they contain, should be the ones creating them. If everyone needs the ability to create/delete these willy-nilly, then you are doing it wrong, in my opinion anyway.

I am still in the midst of getting this back up and running, so I will update once we’re finished. There is an MSDN forum post out there regarding one bug I seem to have uncovered, if anyone wants to look at it and maybe fix my world by answering it :) I am sure I’ll be able to add some more tips and tricks by then.


How I Got My nVidia Driver to Stop Puking on Flash Videos

by Angela 6. September 2012 14:04

***WARNING: this post is 99% rant and 1% helpful tip. You have been warned! ***

Maybe you have seen it too. You’re browsing along happily and… nVidia drivers poop the bed (a.k.a. “Display driver stopped responding and has recovered”), and if you are really lucky, you have to force a hard boot of your system.  At the very least, your browser crashes. Fun, fun!

[Screenshot: the “Display driver stopped responding and has recovered” notification]

So 99.9% of you won’t give even the slightest damn about this, but for ME, this is huge. And let’s be honest, given how many people read this new blog of mine, .1% equates to someone’s left elbow. Anyway, I’ve had my Lenovo W520 for about 10 months now. It ran like a dream until I tried running any kind of Flash video embedded in IE 9 (and at one point Chrome and FF crapped out too, but they only required a Flash update). Given how much YouTube I look at (training, all training! really…) this was going to be an issue. I tried updating my nVidia drivers, many times. Tried uninstalling and re-installing the drivers, the browsers, Flash. Changed my setting between nVidia Optimus mode and discrete graphics. Flashed the BIOS. Cursed and waved my fists. Downloaded the nVidia command center, which is awful BTW. Nothing worked, and so I just got used to using other browsers whenever I could.

No snarky “OH, you’re using IE, THAT is the problem” or “Adobe is evil, just don’t install it”!! I am an IT professional who works for lots of clients who DO use those technologies, and I must be able to support them, end of story. I eventually got into a QA position where IE9 was required, and the site I was testing was FULL of Flash videos. So back into the depths of hell known as support forums I went.

nVidia, IE, and Adobe forums sent me down a hundred useless paths, and I almost gave up. I eventually stumbled upon this little nugget. At first it seemed like another dead end, but then I saw it [clouds part, rays of light stream down from on high]. Wait… what? Right-click the freaking video and turn off hardware acceleration?

[Screenshot: right-clicking the Flash video to open its settings]

[Screenshot: the Flash Player settings dialog with hardware acceleration unchecked]

That’s it?! Well, I mean, I never even knew you COULD right-click the video and mess with Adobe settings. Besides, normally my browser crashes so fast I don’t have the option. So I tried about 8 or 9 times because, you see, I had to right-click, change the setting, and hit Close, all before IE crashed. But eventually it happened, and it worked. So far so good! So maybe you have run into similar issues and this helps you. If so, you are welcome :)

Now, back to watching “Call Me Maybe” covers. I mean, MVC training videos. [snickers]

Tags:

Flash | Adobe | Drivers | Lenovo | nVidia


Are You on Windows 8 RTM yet? Get going in under 30 minutes flat!

by Angela 29. August 2012 10:33

So, I have been on the Windows 8 Consumer Preview since about February, and I owe that to Eric Boyd, who published this kick-ass tutorial, with video, to walk you through the process of setting up a Boot to VHD environment for Windows 8. He really “dumbs it down”, which I love. And let’s be clear, I am not dumb, but I am effing LAZY, and if I can accomplish the same end result in fewer, simpler steps, SIGN ME UP!

Since I do a lot of client work, I didn’t want to completely scrap my current Windows 7 setup with all of my Visual Studio/MTM/TFS 2010 tools. At the same time, I really wanted to be using Windows 8 with Visual Studio/MTM/TFS 2012, without giving up access to my 8-core processor and 16GB of RAM. I know, I am a lucky girl :) So, having a dual-boot option with Win 8 running darn near native was the perfect setup for me. I can choose to boot to either Win 7 + VS 2010 or Win 8 + VS 2012. I know, VS 2010 can live side by side with 2012, but I had encountered issues in the past with them fighting each other and would rather not mess with it. Plus, if I ever need to pave Windows 8 again, it’ll take me all of 30 minutes to do it, obviously, or I wouldn’t be writing this post.

Here is where it gets really awesome. There wasn’t much to the update. I literally went to the drive that housed my old Win8.vhd and deleted it. Now, be sure you’ve backed everything up, unless, like me, you are using some kind of wonderful backup tool like SkyDrive or DropBox. Just be sure to disconnect the image from your backup software before blowing it away, so you don’t accidentally tell it to clear your stored data!

Back to my point. I simply went to my c:\VHD folder, deleted the Win8.vhd file, and then went back through Eric’s tutorial using the latest Windows 8 iso. Easy as pie! 

Just to recap, this was literally all I had to do:

  1. Download the Win 8 iso from MSDN and write down my product key
  2. Then, following Eric’s instructions, perform the following commands (your drive letters may vary; a consolidated sketch follows this list):
    • create vdisk file=C:\VHD\Boot\Win8\Win8.vhd maximum=60000 type=expandable
    • select vdisk file=C:\VHD\Boot\Win8\Win8.vhd
    • attach vdisk
    • create partition primary
    • assign letter=V
    • imagex /info G:\sources\install.wim  <— the only difference now is a second option: 1 = Professional, 2 = Core, so most of you will still pick "1"
    • imagex /apply G:\sources\install.wim 1 V:\
    • bcdboot V:\Windows 
  3. Boot and enjoy!  
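
For convenience, here is the same sequence consolidated into a single sketch. This is not Eric’s exact script: the drive letters (G: for the mounted Win 8 ISO, V: for the new VHD volume) and the image index are whatever applies on your machine, imagex comes from the Windows AIK, and I have added a format step since a brand new volume normally needs an NTFS format before imagex can apply an image to it – double-check all of it against Eric’s tutorial.

    REM Run from an elevated command prompt. The lines between "diskpart" and "exit" are typed inside the diskpart session.
    diskpart
    create vdisk file=C:\VHD\Boot\Win8\Win8.vhd maximum=60000 type=expandable
    select vdisk file=C:\VHD\Boot\Win8\Win8.vhd
    attach vdisk
    create partition primary
    rem assumption: give the new volume a quick NTFS format before applying the image
    format fs=ntfs quick
    assign letter=V
    exit
    REM Back at the command prompt: list the image indexes, apply Professional (index 1), then add the boot entry
    imagex /info G:\sources\install.wim
    imagex /apply G:\sources\install.wim 1 V:\
    bcdboot V:\Windows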

Like I said, it was about 35 minutes door-to-door to get the VHD configured and ready. Applying the Win 8 image was literally the longest-running step in the entire process (24 minutes for me), but alas, we cannot speed that up. Once you boot into your Win 8 VHD, there will be some more setup, especially if your past experience was with the Preview. Here is a run-down of what to expect:

  1. Choose the Win 8 Options from the boot menu
  2. You’ll see a message about “Getting devices ready”; this may take a few minutes
  3. “Getting Ready” message appears
  4. I was then prompted again to select my OS (I left the room so it must have rebooted as part of setup)
  5. Then enter your product key and accept terms
  6. Choose some personalization options (colors, wireless router to connect to, express settings, etc.)
  7. Sign into your account
  8. Win 8 installs some base apps, gives you a little color rotator show while things happen, applies some final settings and off you go!
  9. Don’t forget to download SkyDrive to re-sync all your stuff, and turn on the Hyper-V feature!

Ahhhhhhhhhhhh…


Tags:

Windows 8 | SkyDrive | Hyper-V


Visual Studio 2012 Launch Event Coming to Chicago in September!

by Angela 29. August 2012 04:50

You might have heard that the official launch of Visual Studio 2012 is coming soon! Alas, we cannot all afford to hop on a plane and head out to Washington State to party with the product team. BUT, lucky for you, there are also going to be local launches held at major cities across the U.S. You might not have noticed, because all the marketing jazz has been heavily focused on the Windows Azure part of that event, but there is going to be some great content around the development tools as well. Now you know!

Join Polaris Solutions at this free launch event in Oak Brook, IL (about 20 miles west of Chicago) to check out some of Microsoft’s newest leading-edge tools, including Microsoft Visual Studio 2012, Windows Azure, Windows Server 2012, and Microsoft System Center 2012. You'll get the opportunity to engage with the experts (like me), get hands-on with the new technology, and learn how to build modern applications both on-premises and in the cloud using the Microsoft platform.

A special Visual Studio 2012 launch track was recently added to the CHICAGO event, with a keynote from Brian Harry himself. I know, cool, right?! :) In his talk, you will learn how Visual Studio 2012 can help you evolve your development practices to maintain relevancy, adapt to change and deliver on the needs of the business, rise to the challenge of the “New Normal”, and elevate your skills to keep pace with the fast-changing world of application development and delivery. Be sure to stop by after the keynote and visit us at the Polaris Solutions booth as well!

At the event, you will also be able to participate in a raffle for a chance to win an Xbox 360 + Kinect Bundle.  Get registered soon before it sells out:  https://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032521310&Culture=en-US&community=0

Tags:

ALM | Application Lifecycle Management | Microsoft Test Manager | Microsoft Test Professional | SDLC | TFS 2012 | TFS | Team Foundation Server | Visual Studio | Visual Studio 2012


August Chicago ALM User Group - Announcing Git Integration with TFS

by Angela 16. August 2012 10:43

I know, Microsoft supporting non-.NET developers and non-Windows folks? Inconceivable! ::gasp:: 

OK, so if you’ve been paying attention for the past couple of years, you might already know that this has been happening slowly. But recently there have been some seriously MAJOR developments. First, Microsoft made Entity Framework open source, and now they have added MVC, ASP.NET, and more to that list. Dogs and cats, living together, mass hysteria… and all that. Then, just when you thought it couldn’t get crazier, they announced TFS integration with Git! My head just exploded a little, how about yours?

Come to the Chicago Microsoft office on August 29th and meet one of the TFS product team members. You heard it, ONE OF THE DUDES WHO WRITES CODE FOR TFS ITSELF! Edward Thomson will be discussing how to take advantage of the new git-tf tool to synchronize a local Git repository with Team Foundation Server. This bridging tool is especially useful for cross-platform developers, such as iOS developers working in Xcode.
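
If you want a feel for what that workflow looks like before the meeting, here is a minimal sketch based on git-tf’s published examples. The collection URL and server path are placeholders, and the exact options may vary by version, so treat this as a teaser rather than a reference.

    REM Clone a folder from a TFS team project into a local Git repository (placeholder URL and path)
    git tf clone http://tfsserver:8080/tfs/DefaultCollection $/TeamProjectA/Main --deep

    REM Work locally with plain Git, then sync changes back and forth with TFS
    git commit -a -m "my local change"
    git tf pull --rebase
    git tf checkin --deep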

Edward Thomson is a Software Development Engineer at Microsoft, where he works on cross-platform version control tools for Team Foundation Server.  Before joining Microsoft, Edward worked on numerous source code control tools for Microsoft and Unix platforms.

Register now to make sure you get a spot. Building security also requires it, and it helps me not order gobs of food no one will show up to eat. So help a girl out, huh?

Tags:

ALM | Application Lifecycle Management | MSDN | SDLC | TFS | Team Foundation Server | VS 11 Beta | Visual Studio | Open Source | git | TFS 2012


Have you registered for that conference yet? Er, I mean, thatConference!

by Angela 23. July 2012 07:48

My husband and I actually registered months ago, and I’ve been looking forward to thatConference for years, literally. Back in 2010, when it was just a twinkle in Clark’s eye, I remember thinking “holy cow, Summer 2012 is a long way away!” and yet here we are, just 2.5 weeks away. But don’t worry, it’s not too late. There is still plenty of time to register if you haven’t already, and I believe there is still a significantly discounted room block as well, so don’t put it off any longer.

Be sure to check out the session schedule; there is going to be quite an amazing lineup of speakers and topics across many platforms. Speakers will be talking about everything from Java to iOS to MVC and ASP.NET. Considering how much cheaper it is than TechEd, VSLive, and most of the other events you’ve probably been wanting to go to for years, AND that the content looks to be just as good, how can you afford NOT to go? Besides, I’ve been to PDC and VSLive; none of those conferences are held at awesome water parks, host free game nights for everyone, or include a free poolside pig roast. Seriously, this is going to be RAD!

One more thing: we still need additional sponsorship! If you, your company, or any other organization you know of would be interested in getting involved or getting some excellent exposure to a huge group of the Midwest’s geekiest and finest, please consider becoming a sponsor today. There are sponsorships at MANY levels, and all of them are important to our success. Our hope is to deliver the best possible experience for our attendees and ensure that thatConference will return in 2013 in an even bigger way!

I am really looking forward to a weekend of geekery in the Wisconsin Dells next month. Hope to see you there :)

Tags:


Microsoft Test Manager (MTM) Tip O’ the Day – Filtering Test Lists

by Angela 3. July 2012 07:41

Now, I am no @ZainNab, the guru of “Tips and Tricks”, but I occasionally run across features that have been staring me in the face for YEARS and yet somehow went completely unappreciated, sometimes unnoticed. And then one day it hits me and OMG my life is easier, and I want to tell everyone. Sure, it’s a bit embarrassing to admit sometimes, given that I worked at Microsoft for 5.5 years focusing on the Visual Studio tools, but who hasn’t done that? Not you? Really? I am skeptical… There are, after all, a bajillion commands to try to remember. For real, if you don’t believe me, look at the entire book that Sara Ford and Zain wrote about it. It’s worth every penny and Amazon has a great deal on it, so pick up a copy! :)

So, back to my point. I was sitting in MTM, looking at a fairly daunting list of PBI-based test suites, thinking “now which PBIs were the ones where I had test cases to run again?” I started thinking about writing a query, but that only helps if YOU are assigned to the test case; it doesn’t really help with test RUN assignment. Then it all came flooding back. Wait, there’s this FILTER button to sort that out. And conveniently it’s right there in front of my face ::face palm:: I felt a little better when no one else admitted to noticing it was there either. Maybe they were just being nice to me. Either way, in case you didn’t notice it, check it out. Before:

[Screenshot: the test suite list before filtering]

After, I have FAR fewer test suites to look at:

[Screenshot: the test suite list after filtering]

That’s my Microsoft Test Manager tip o’ the day! I won’t be posting them every day like Zain has been doing on his blog around Visual Studio 2010 for the past couple of years; then again, I also don’t mainline 5-hour Energy like he does :) I will post them whenever I can. Hope this was helpful! Feel free to post any tips of your own, or shoot me a note if you have other questions or comments.


June ALM User Group Meeting–Acceptance Testing Using SpecFlow!

by Angela 17. May 2012 06:12

Get ready, we have a packed summer full of great topics at the Chicago Visual Studio ALM user group! Join us in beautiful downtown Chicago at the Aon Center in June for this next session on how to improve your user acceptance testing practices using SpecFlow! Pre-register on our user group site so we can get you entered into the security tool, and please do keep us posted if you have to cancel! We don’t like throwing away food, and it helps me to order the right amount.

Topic Description:

Imagine a project you’ve worked on in the past. Whether or not you or your organization makes use of Agile processes, you probably spent a good deal of time going back and forth with business stakeholders on the fine details of how the software you’re building should behave. It’s possible you had to dedicate effort simply to producing a demo that the business would appreciate and understand. It’s even more likely that at some point, you and the business disagreed on whether something was “working”, “finished”, or “done”. Those types of discussions can leech away your team’s time, waste effort, and hurt morale, as well as create tension between development teams and the business.

Now imagine if you could instead pour that blood, sweat, and tears into developing your application’s functionality. Imagine a scenario where new features are authored test-first, by non-tech staff in a plainly understandable, shareable, and versionable text format. Imagine a situation where the same set of specifications can be shared to drive a browser-based test suite at the same time that the specifications drive an integration test suite. These are the types of scenarios that tools like SpecFlow are particularly well-suited to address.

Unit tests are great for verifying atomic pieces of software functionality, but they are very poor at capturing and communicating specifications at any resolution other than fine-grained. They’re also completely useless to a non-technical user attempting to understand a system’s functionality.

This is where acceptance testing enters the picture. Although commonly classified as BDD (behavior-driven development), tools and frameworks like SpecFlow serve to bridge the gap between proving the correctness of a piece of code from the inside, micro perspective and proving the correctness of an application as a whole from the outside perspective.

In this talk, we’ll go over what acceptance testing is, when it should be used, and how to add acceptance testing into an existing application using SpecFlow. We’ll also talk a bit about DSLs (domain-specific languages), the pyramid of returns vs. effort when it comes to different types of testing, techniques for authoring and designing tests and bindings, and finally, because this *is* a group about ALM, how to integrate SpecFlow into a CI environment and why you or your organization should do so.

If attendees wish to follow the demo on their laptops, they can save time by pre-installing the VS tooling for SpecFlow – http://specflow.org. The download there adds some tooling support within the VS IDE, but it is not strictly needed to run SpecFlow.

 

Speaker Bio:

Josh Elster is the founder and principal of his independent production and consulting company, Liquid Electron. With clients ranging from small media design shops to multi-billion dollar corporations, Josh’s experience spans a number of different sectors, projects, and roles. In February of 2012, Josh joined the community advisors board for Microsoft’s Patterns and Practices team for the CQRS journey project (http://cqrsjourney.github.com), as well as being a contributor. Like the common cold, but without the whole being ill aspect, it is Josh’s hope that he can infect others with his passion for software development. When not serving as Patient Zero, Josh can be found reading, playing video games or guitar, or coding. His website can be found at http://www.liquidelectron.com. His Twitter handle is @liquid_electron. His most recent demonstration project, the PostcardApp, can be found at http://www.postcardsfromskyrim.net.


An interesting Quest (pun intended)…into Agile testing!

by Angela 9. May 2012 08:57

So there is a fantastic little conference gaining steam in the Midwest called Quest, which is all about Quality Engineered Software. If you’ve never heard of it, you should seriously check it out next year, regardless of your role. As I have always said, Quality is NOT the sole responsibility of the testers, and this conference has something for everyone. I was fortunate enough to be introduced to the local QAI chair, who runs the conference, the first year it ran (2008), which, lucky for me, also happened to be in my back yard. I was with Microsoft at the time, and we had opted in as the biggest conference sponsor, ’cause let’s be real – who on earth in QA ever thought “Yeah, Microsoft has some awesome testing tools”? ::crickets:: Right.

At the time, VSTS (remember THAT brand? :P) was still new-ish, and the testing tools were focused almost entirely on automated testing. Yeah, I know, TECHNICALLY there was that one manual test type, but let’s not even go there. I knew a few customers – like, literally 3 – who used the .MHT files to manage manual tests in TFS, but it wasn’t enough. The automated tools were pretty awesome, but what we found was that MOST customers were NOT doing a lot of automation yet. Most everyone was still primarily doing manual testing with Word and Excel, and maybe SharePoint. We had a great time at Quest talking to testers and learning about what they REALLY need to be happy and productive, we got the word out on VSTS and TFS, and we started planning for the next year. I was able to be part of Quest as a Microsoftie in early 2009 as well, when the 2010 tools (and a REAL manual test tool) were just starting to take shape, and then the conference spent a couple of years in other cities. Fast-forward to 2012, when Quest returned once again to Chicago.

I was no longer a Microsoftie, but if you’ve ever met me you know that working a booth and talking to as many people as possible about something I am passionate about is something I rock at, and enjoy! So I attended Quest again this year, this time as a guest of Microsoft. I worked the Microsoft booth doing demos and answering questions about both the 2010 tools and the next generation of tools, and WOW did we get some great responses to them. Particularly to the exploratory testing tools. I am pretty sure the reverse engineering of test cases from ad-hoc exploratory tests, and the 1-click rich bug generation that sends ALL THE DATA EVER to developers, gave a few spectators the chills. I certainly got a lot of dropped jaws and comments like “THIS is a Microsoft tool?!” and “I wish I had this right now!”. It was pretty great.

I was also fortunate enough to get to attend a few pre-conference workshops, keynotes, and a session or two. I have to say, WOW, the conference is really expanding, and I was very impressed with the quality of the speakers and the breadth of content. As a born-again agilista, I was so pleasantly surprised to see an entire TRACK on Agile, with some great topics. I was able to attend “Transition to Agile Testing” and “Test Assessments: Practical Steps to Assessing the Maturity of your Organization“ and learned quite a bit in both sessions. One disappointment: there is even more FUD in the QA world than what I see in the developer world when it comes to Agile – what it actually means and how it SHOULD be practiced. I’m not about being a hard-core, “to the letter” Scrummer or anything, but I am also not about doing it wrong, calling it Agile, and blaming the failure on some fundamental problem with Agile. There are lots of Agile practices that can be adopted to improve how you build, test, and deliver software without going “all in”, and that was something I kept trying to convey whenever I spoke up.

I heard “Agile is all about documenting as little as possible”, “Agile lacks discipline”, “Agile is about building software faster”, and all of the usual suspects you would expect to hear. No, it’s about documenting only as much as is necessary; there is a difference! Agile actually requires MORE discipline. People on Agile teams don’t work faster, they just deliver value to the business SOONER than in traditional waterfall models, which, sure, can be argued is “faster” in terms of time to market. The only things that would make me work faster are a better laptop and typing lessons. I still look at the keyboard, I know ::sigh:: I am seriously considering doing a session next year on Mythbusting Agile and Scrum, to help people understand both the letter and the spirit of Agile practices. Overall, it was great to see that the QA community is also embracing Agile and attempting to collaborate better with the development side of the house. We just need the development side to do the same ;) I also met at least a dozen certified Scrum Masters in my workshops, which was great to see!

One of my favorite parts of the conference was, of course, getting to catch up and talk tech with Brian Harry. He was the first keynote presenter of the conference and spoke on how Microsoft “does Agile”, the failures and successes along the way, and he even spent some time talking about his personal experiences as a manager learning to work in an Agile environment. I.LOVED.THIS. Yeah, I’m a bit of a Brian Harry fan-girl, but it really was a fantastic talk, and I had many people approach me in the booth later to comment on how much they enjoyed it. My favorite part was Brian admitting that at first, even HE was uncomfortable with the changes. It FELT like he was losing control of the team, but he eventually saw that he had BETTER visibility and MORE control over the process, and consequently the software teams. It was brilliant. So many managers FEAR Agile and Scrum for just those reasons. It’s uncomfortable letting teams self-organize, trusting them to deliver value more often without constant and overwhelming oversight by project managers, and living without a detailed 2-year project plan – one that, in all actuality, is outdated and invalid as little as a week into the project. Wait, WHY is that scary? Sorry, couldn’t let that get by.

And so off I go again, into the software world, inspired to keep trying to get through to the Agile doubters and nay-sayers, and to help teams to adopt Agile practices and tooling to deliver better software, sooner.

Tags:

Agile | ALM | Application Lifecycle Management | TFS 2010 | SDLC | Team Foundation Server | Testing | Test Case Management | User Acceptance Testing | VS 11 Beta | VS 2010 | Visual Studio | development
