beingexchanged: May 2008

Wednesday, May 28, 2008

From the field

Brace over at Double-Take Software had a great report from the field with info for those folks who are about to move their Exchange infrastructure to a new location:

"A good engineer that I worked with on a Dell project contacted me yesterday and brought up a good question. He is working with a client in New England, and they are in the process of planning to move their Exchange Server to a different site and subnet. Because Exchange is so tightly integrated with Active Directory, there is much more to plan for when moving Exchange than there is with Double-Take. However, the question is below, and I have some answers from one of our own PS engineers.

"I'm working on a project where the customer needs to change the IP address of all their existing Exchange 2003 servers and move them to another subnet. I'm familiar with the process for Windows and Exchange, but how does that affect Doubletake? Are there any KB articles that might explain the process?"

If using the DTAM functionality within Double-Take, it should be as easy as re-enabling protection once the server has been moved: "Just disable protection before doing the re-IP'ing, then go back through setting up the protection afterwards. You should probably not select the 'use last configuration' option (or whatever that option is called), since it might pick up the wrong IPs."

Pretty easy, but as a best practice you should always contact our technical support department for any type of deployment planning, just to verify the changes being made. Double-Take Technical Support can be reached 24/7 at 800-775-8674 or 317-598-2066.
For Technical Support in the European Union: +44 (0)1905 330820 (UK), +33 (0)1 47 77 15 06 (France), or +49 69 6897776-66 (Germany).
Or find answers online at support.doubletake.com!"


Of course, that client would probably want to check out the latest version of the Full-server Failover Option too, as that could make life a lot easier during migrations of Exchange, or anything else running on Windows.


posted by Mike Talon at 0 Comments

Tuesday, May 27, 2008

Is it really SCR vs. Double-Take?

[blogger's note: Yeah, I phoned it in this week due to the holiday =) This is a duplicate to my blogging at the Double-Take Software official site (doubletakesoftware.wordpress.com). More original content will be coming soon.]

With the release of Exchange 2007 SP1, Microsoft finally released the long-awaited server-to-server log shipping system for the Exchange platform. Standby Continuous Replication (SCR) – or Server Continuous Replication, depending on who you ask – is the idea that log files can be transmitted to another Exchange 2007 server, where they are played into a copy of the databases from the production system. That allows you to perform database portability in the event of a loss of the production server, and resume services as quickly as possible with minimal lost email. This technology is an extension of the Cluster Continuous Replication (CCR) and Local Continuous Replication (LCR) tech that was included in the RTM release.
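For reference, enabling SCR is done from the Exchange Management Shell in SP1. Here's a minimal sketch – the server and storage group names are placeholders, and this only runs on an Exchange 2007 SP1 box:

```powershell
# Run from the Exchange Management Shell on an Exchange 2007 SP1 server.
# "EXCH01\First Storage Group" and "DRSERVER" are placeholder names.

# Enable SCR for a storage group, shipping logs to the standby server:
Enable-StorageGroupCopy -Identity "EXCH01\First Storage Group" `
    -StandbyMachine "DRSERVER"

# Check replication and replay health on the SCR target:
Get-StorageGroupCopyStatus -Identity "EXCH01\First Storage Group" `
    -StandbyMachine "DRSERVER"
```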

So, one would think that Microsoft is squeezing out 3rd party Disaster Recovery (DR) tool vendors, but nothing could be further from the truth. SCR is an incredible tool, but as with all the other tools included for free with Exchange, it has its limits. First is the restriction to one Store per Storage Group. This generally isn’t a true limitation for most organizations, since you can have up to 50 Storage Groups, but if you were already planning out your Exchange 2007 implementation with more than one Store per Group, you will have to change that configuration unless you use a 3rd party tool like Double-Take (see disclaimer at the bottom of the page).

Also, SCR contains no methodology for restoration and failback of the data and users. There are manual ways to use the SCR tools to accomplish this, but not in a bandwidth and time-effective manner. I wouldn’t go so far as to call this a shortcoming of SCR, as the tool was designed more for DR and not for immediately switching users back and forth, but it is a bit of a hindrance in an overall Business Recovery Plan, and does nothing to get you closer to Dynamic Infrastructure.
3rd party solutions like Double-Take can open up even more options when it comes to DR protection for Exchange Server 2007 - and all other supported versions as well. As a matter of fact, that may be the best case for the use of 3rd party tools: SCR only protects Exchange 2007 mailbox databases. If you are in a hybrid environment, or rely heavily on Public Folders and don’t wish to depend on Public Folder replication, you will be unable to use SCR as the sole method of High Availability for non-clustered Exchange environments.

All said, SCR is a huge step forward for Microsoft, but in many environments it may still need a little help to meet robust DR goals.

Wednesday, May 21, 2008

TechEd 2008!

It's less than a month away, and I will be there!

Here's my QuickConnect Card:


Join Me at Tech·Ed Connect!

Tuesday, May 20, 2008

Double-Take VRA Announcement

Double-Take Software recently announced its latest creation, the Double-Take Virtual Recovery Assistant. I've also posted on this one, but you can see the whole story at http://doubletakesoftware.wordpress.com

How big is too big?

In Exchange 2003, the limits for Store sizes are 75GB for Standard Edition (with SP2) and 8TB theoretical for Enterprise Edition. While the Standard limits are well within the means of most hardware, 8TB is a whopping amount of data for any server to handle, much less one running a resource-intensive application like Exchange Server.

In reality, the largest single Store I’ve seen in the real world was around 800GB, and even that one was suffering from performance issues. Most of these revolved around attempts to do full-text indexing; however, even when that was removed from the equation, performance for Outlook users was still far below par.

Everyone’s servers are different, and therefore everyone’s environments will also be different, but there are some universal constants in the Exchange 2003 world that will impact how big a Store can grow. First off, there are physical limits to how much RAM an x86 system can address. Even with the /3GB switch, Exchange will have a limited amount of physical memory to work with, after which virtual memory paging will begin. As Exchange is notorious for performance grinding down when paging is overused, this is a condition you want to avoid at all times. Therefore, each Exchange server is limited in how many Outlook connections, SMTP sessions, and other RAM-consuming resources it can support at any given time.
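For reference, the /3GB switch lives in boot.ini on Windows Server 2003, and Microsoft's Exchange 2003 guidance pairs it with /USERVA=3030. A sketch – the disk/partition path will vary by system:

```
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB /USERVA=3030
```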

Secondly, third-party solutions may have issues as databases grow larger. I’ve personally seen Blackberry Enterprise Server start to have issues with larger Stores around the 500GB mark. Since BES requires the ability to stay in communication with the Store, as performance degrades, there is a significant risk that you could have BES issues, including re-scanning of the databases. This process can take up to several hours on larger databases, which means that you do not have mobile email during that time.

There are many articles written about sizing Exchange Servers properly, such as those at www.msexchange.org and others, so I won’t go into a lot of detail here. It is important, however, to keep aware of the size of Exchange databases, and how ever-growing email Stores can dramatically impact your servers’ ability to serve.

Monday, May 12, 2008

Everything old is new again

Just when it seemed we’d left the old DOS prompt behind, Microsoft has reintroduced the command shell, with a vengeance. PowerShell, formerly Monad, is a command-line interface for Windows Server 2003, Windows XP, Windows Vista and Windows Server 2008. The idea is that most routine work done by engineers and admins can be boiled down to a series of repetitive commands. Instead of clicking boxes in a GUI, you run cmdlets – PowerShell’s term for its small, single-purpose commands – and can automate a lot of your work.

While the PowerShell system has been around for nearly two years in one form or another, it was with the launch of Exchange 2007 that it got its first real-world function set applied to an application. Everything you can do in the Exchange Management Console can now be done in the PowerShell tool instead. As a matter of fact, all the EMC does is launch the appropriate cmdlet sets, even if you’re clicking on things. In addition, cmdlets let you do a lot of things you can’t accomplish in the GUI, such as Database Portability functions.

In some respects, the PowerShell system does make life easier. Typing Get-StorageGroup into the command window and having the system spit back a list of all Storage Groups in the entire Exchange 2007 org is quite impressive, and much faster than navigating the GUI to get the same list. Likewise, Get-Mailbox spitting back not only all the mailboxes on a server but also the Storage Groups they’re assigned to is quite convenient if you need to find info quickly.

The drawback is that there is a steep learning curve associated with PowerShell. As with any other command-line system, there isn’t a set of contextual clues that hint at where information might be. While there is an extensive help system (invoked by typing “help [command]”, where [command] is the cmdlet you’re having trouble with), you have to know what command you’re trying to figure out before you can get help on that command. There are, of course, many books and references available, both free and for a fee, so this isn’t an insurmountable problem, but it does pose a barrier to the novice.
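To illustrate the commands discussed above, here is a quick sketch. These require the Exchange Management Shell on an Exchange 2007 server (they won't run in a plain PowerShell window without the Exchange snap-in), and "EXCH01" is a placeholder server name:

```powershell
# List every Storage Group in the entire Exchange 2007 org:
Get-StorageGroup

# List the mailboxes on one server, along with their databases:
Get-Mailbox -Server "EXCH01"

# The built-in help system, here shown with worked examples:
Get-Help Get-Mailbox -Examples
```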

PowerShell is a great step forward for Microsoft. It will lay the foundation for the Core Edition versions of Windows Server 2008, and allow for much more automation and flexibility for applications like Exchange. It’s also a great leap backward, since many of us have routinely used command-line systems since we got started in technology. Get past the learning curve, and you’ll find a useful and convenient tool at your service.

Friday, May 9, 2008

No flying cars, no SQL back-end

Comedian/Actor Lewis Black hit the nail on the head in one of his earlier bits when he recalled all the futurists of the past (try bending your head around that one) who gave him a false view of the 21st century. He waxes poetic about how dinner is not in pill form yet, he has no jetpack, and worst of all, in his now-immortal comparison between the 20th and 21st centuries, “No flying cars, no flying cars!”

Many Exchange Engineers are beginning to feel the same way about whether we’ll ever see a Microsoft SQL back-end for Exchange Server. Since the first rumblings of the Titanium Project – which eventually became Exchange Server 2003 – we have been salivating at the chance to have a true industry-standard formatted database on the back end of the Exchange system. It would open up new architecture possibilities, allow for more scalability, and enable more 3rd party integration.

However, the release of 2003 came and went with the ESE database still in place. So, we took it in stride and moved forward, living on the promise that in the next version we would see a SQL back-end, our very own flying car paradigm.

Then, early betas of Exchange 12 came out. Lo and behold, no SQL database. The EDB was still there, though they did finally get rid of the STM. Various reasons for this have made the rounds of the net, blogs, newsgroups, RSS feeds, etc. The two prevailing theories are that it just couldn’t be done with the architecture still structured the way Exchange needs, and that using a SQL back end could compromise the security of the overall Exchange platform. Both are very legitimate reasons to hold off, but it had been 4 years since the release of 2003, and Exchange 2007 changed the game so much that a new database structure wouldn’t have been shunned. As for security: a real concern, but one that probably could have been overcome with proper configuration and coding.

So, we have Exchange 2007 – an incredible product that changes the game when it comes to enterprise messaging and collaboration. But we still have no SQL database back-end.

Then and now, to paraphrase Mr. Black, “No SQL back end, no SQL back end!”

Friday, May 2, 2008

TimeData Webinar

I’ll be hosting a Double-Take Software TimeData webinar on May 21st, 2008 at either 10AM or 3PM EST.  We’ll be discussing TimeData in general, but we will be talking about Exchange protection quite a bit, so feel free to join us!


Register for the TimeData May 21 Webinar


TimeData, for those who haven’t seen it yet, is a Continuous Data Protection tool designed to get back data that has been lost to accidental or malicious destruction (corruption, virus attack, etc.). It works side-by-side with Double-Take for server protection, but is designed to recover data when the production server is still working fine for everything except the data you lost.




Are public folders going away?

Microsoft has “de-emphasized” public folder systems in Exchange 2007 in favor of SharePoint integration for both desktops and Outlook Web Access.  While that looked like a great idea in the initial release, a combination of user pushback and the lack of public folder to SharePoint conversion tools has led MSFT to put significant public folder management tools back into 2007 with the release of Service Pack 1.


So, are public folders here to stay?  It would seem that many users still rely on them for communication and sharing files, but that can be problematic in many ways.  First off, it increases the size of the Exchange Server’s data stores - often dramatically.  It also makes finding things that are squirreled away in public folders much more difficult than finding the same objects in a SharePoint system.  SharePoint indexes files and folders much more effectively, and does so without creating overhead on the email/calendar/contact management systems to boot.


Yet even though it would have been better and more effective to jump to SharePoint with 2007, the vast majority of Exchange users did not; given the intense reliance on public folders in the Exchange world today, they instead rallied to get MSFT to put those functions back into the product.


This desire to stay on public folders was only enhanced by the lack of tools designed to help migrate off public folder-based systems.  There are a few great tools out there, but nowhere near the level of systems designed to, say, migrate from Lotus Notes or other mail systems.  Without the ability to get the existing public folder data into some other software package like SharePoint, most existing Exchange users will find the migration a horrific uphill climb.


Finally, there will always be end-users resistant to change. With email, calendars and contacts, you can leave your users with applications they already know - such as Outlook 2003.  But with public folders, users who are used to working with Outlook will often be resistant to learning how to navigate and use SharePoint and other tools.  This may be the biggest reason that Public Folders are not going anywhere fast, no matter how much MSFT threatens to pull them from the next version(s) of Exchange.




Testing BlogJet

I have installed an interesting application - BlogJet. It's a cool Windows client for my blog tool (as well as for other tools). Get your copy here: http://blogjet.com


"Computers are incredibly fast, accurate and stupid; humans are incredibly slow, inaccurate and brilliant; together they are powerful beyond imagination." -- Albert Einstein


Hi There!

This is a blog about Microsoft Exchange, or rather it's about to be a blog about Microsoft Exchange. Hang in there while we get this site up and running.