Author: Adam Fowler

Coping with Infinite Email

Automatic Deletion of Deleted Items with Retention Policies

Exchange 2010 and 2013 have an option called “Retention Policies”. I’ll base the below on what I see for Exchange 2010, but most if not all of it should apply to 2013 also.

Retention Policies are useful if you need to keep your users’ mailboxes clean, and to avoid a Deleted Items folder containing every single email an employee has received in their time with the company. You can work out what the company agrees can and can’t be auto-deleted, and save a lot of money on storage for both live data and backups.

Retention Policies are made up of “Retention Policy Tags”, and these tags “control the lifespan of messages in the mailbox”, as one of the wizards you configure this in puts it. The Retention Policy is then targeted at the mailboxes you want these settings applied to.

[Image: Gandalf] Maybe not this wizard.

It’s worth noting that a mailbox can only have one Retention Policy linked to it, so you need to plan overlapping settings accordingly.

So, what can a Retention Policy Tag do? You give it a “Tag Type”, which is either a specific built-in folder in someone’s mailbox (e.g. Deleted Items) or every folder that isn’t a built-in folder. Based on which folders the tag covers, you can either set an age limit for all items in those folders, or set the items to never age.

[Screenshot: the Deleted Items retention tag]

The age limit is a number of days, and it actually means something different depending on which Tag Type was targeted. For an email in the Deleted Items folder, it’s based on the date the item was deleted, stamped on the item at the time of deletion. There are some caveats around that, so refer to this chart on TechNet, which lays out how the retention age is calculated.
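As a rough sketch, here’s what creating a tag like that could look like in the Exchange Management Shell (the tag name is made up, and the 60 day Delete and Allow Recovery settings match the example I use at the end of this post):

# Tag for the Deleted Items folder: items are deleted (but still
# recoverable) once their retention age reaches 60 days.
New-RetentionPolicyTag "Deleted Items - 60 Days" -Type DeletedItems `
    -RetentionEnabled $true -AgeLimitForRetention 60 `
    -RetentionAction DeleteAndAllowRecovery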

There’s also a Default Archive and Retention Policy (called MRM Policy in Exchange 2013) that is applied to all archive-enabled mailboxes that have no other policy applied (remember, a mailbox can only have one). So if you have simple requirements, use this policy. For more complex requirements, you’ll need multiple policies, and either manual management of mailboxes to apply the right policy, or a script that’s run at regular intervals.
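For the multiple-policy route, here’s a minimal sketch of creating a policy from the tag above and applying it in bulk (the policy name and OU are made up for illustration):

# Create a policy containing the tag, then apply it to every mailbox in an OU.
New-RetentionPolicy "Standard Users" -RetentionPolicyTagLinks "Deleted Items - 60 Days"
Get-Mailbox -OrganizationalUnit "Staff" -ResultSize Unlimited |
    Set-Mailbox -RetentionPolicy "Standard Users"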

Once you’re set up, the policies are enforced by the Managed Folder Assistant. This runs on the Exchange server and is controlled by the Microsoft Exchange Mailbox Assistants service. It used to be schedule based (Exchange 2010 pre-SP1), but from SP1 onward, and in Exchange 2013, it’s an always-running throttled process. It’ll do its work when it’s the ‘right time’, based on several criteria and checks. If you want to know the specifics, read this from TechNet.

To check that the policy has applied, you can go to the properties of the relevant folder in the mailbox in question (for me it’s Deleted Items) and you’ll see the policy listed:

[Screenshot: the policy listed in the folder properties]

You can also look at the individual emails to see both the retention policy applied, and when the email will expire. This is what I see from Outlook 2010:

[Screenshot: the retention policy and expiry date on an email in Outlook 2010]
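You can also confirm which policy a mailbox has been given from the shell; a quick check (using the same example mailbox as below) is:

Get-Mailbox -Identity "guyinaccounts" | Format-List RetentionPolicy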

If you want to process a particular mailbox right now because you’ve just configured something, you can use the PowerShell command:

Start-ManagedFolderAssistant -Identity "guyinaccounts"

If you want to do more than a single mailbox, you’ll need to pipe it (there’s a sketch below); again, more details here on TechNet. The Event Viewer on your Exchange server should tell you how it went, but from some of the information I’ve read, a Retention Policy that’s only just been targeted at a mailbox can take up to 48 hours to be recognised and start processing. For me it took more than a few hours before I could see the policies on my emails.
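Here’s that sketch: processing every mailbox in one go (adjust the Get-Mailbox filtering to suit your environment):

# Run the Managed Folder Assistant against every mailbox.
Get-Mailbox -ResultSize Unlimited |
    ForEach-Object { Start-ManagedFolderAssistant -Identity $_.Identity }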

One last point: Exchange only starts tagging emails once you first create and apply a policy. For my example, I set a 60 day Delete and Allow Recovery tag on the Deleted Items folder. This caused all existing deleted items, which went back a few years, to be marked for deletion 60 days from when I applied the policy. It won’t go back and instantly delete your older items.

Who Will Be There For The Long Run?

You may have noticed that the theme on my blog has changed. The theme I was using was a light version of a pro product, which I didn’t buy. I was looking at changing some small settings and discovered that the creator of the theme had stopped supporting it a few months ago.

Knowing that I’d probably have issues in the future, I decided to find a different theme. It had to work with the content I already had and look pleasing enough to me. I also didn’t want a v1.0 theme, because that gives me no assurance that the creator has any interest in updating it when future WordPress versions are released.

I realised that this same methodology is how I approach most pieces of software. Ideally it needs to have been around for a little while, to prove the creators can deliver and will keep their product updated. It needs to have good support, either from the community or the creators. It needs to integrate well with existing systems, but also not cause you to be locked in to the product itself.

After working in I.T. for a while, I’ve found this is instinctively how I think. A big factor was learning from when things went wrong, whether in implementations, upgrades or changeovers, and considering what decisions should have been made early on to prevent the problem.

This in itself causes issues, because how can a software solution get customers if everyone wants something that’s already proven? Companies will often take risks if option B is substantially cheaper than option A, or if the vendor of the software has proven themselves with other solutions… but generally it’s safer to go with the proven solution.

Maybe this methodology is changing with the rapid release cycle we’re now seeing globally. It’ll probably cause more issues due to less testing time and more updates, which instinctively is the opposite of what we’ve all learnt to do in IT. This applies to the cloud too – you’re putting your faith in a 3rd party, but you have no visibility or control over changes. Without that visibility, how do you know everything of yours will still work after a change? Or will you be left trying to find another cloud vendor that works with your existing setup?

So yes, I have a new theme. It works, and it’s free. It’s newer than v1.0, so at least there’s some evidence that it will be maintained, but the creators may stop at any time. I’m not giving them any money so I can’t complain, but longevity is still the fundamental basis of my decision process. Luckily it’s quite easy to change themes, because of the well designed plug-and-play style of WordPress themes. This is what I expect from any software vendor (but rarely get), and anything less increases the risk of pain – it may not be now, but chances are it will come.

ioSafe 214 NAS Review

The ioSafe 214 NAS was provided to me by ioSafe to check out. I’ve looked at a few NAS units before, but generally low end devices. This unit is far from low end, having both advanced management capabilities and superb physical protection.

[Image: the ioSafe 214]

“Superb” is a big call, but this NAS is fireproof and waterproof. Trevor Pott and Josh Folland tested the fire side of this here (The Register); it’s rated at 1550°F for half an hour, while the water side is rated at 72 hours at a 10 foot depth. There are a bunch of videos on YouTube too if you want to check those out. I chose not to test these specifications, as I really liked the unit.

Full specifications are available here from ioSafe’s website, but here’s a quick rundown. The NAS is dual bay, and will officially take up to two 4TB SATA drives. There are three USB interfaces (a single USB 2.0 port on the front, and two USB 3.0 ports on the back), with the back also containing a single gigabit ethernet port and a power port. The only other item of interest is the copy button on the front, which I’ll go into later.

The ioSafe 214 is ‘powered by Synology DSM’, which I think just means it has a Synology 214 inside it… and I was very impressed by that. I’d pictured the web interface of the NAS as some unexciting, poorly designed experience, but it was similar to using a desktop, with shortcuts and programs.

Here’s the ‘desktop’ which you’ll see after logging onto the NAS via HTTP:

[Screenshot: the DSM desktop]

I’m still impressed now after using this for a few weeks. The left hand side contains these highlights:

File Station – This lets you create and manage shares and the files/folders within

Control Panel – This opens the control panel as per the screenshot above. There are a huge number of options here, including setting up LDAP/Active Directory connectivity, user management, device updates, indexing the media located on the drives, and so on.

Package Center – This is the Synology App Store. You might think this isn’t exciting, but for starters everything is free. There are tools like Antivirus and DNS Server, but also Asterisk (want to run your phone system off this?), Mail Server, MediaWiki, RADIUS Server, Tomcat, VPN Server, WordPress and so on. This turns a basic NAS into a server with a multitude of abilities.

One extra application of note is ‘Download Station’. This will download from a bunch of different protocols: BitTorrent, FTP, HTTP, Newsgroups, eMule (is that still used?) and a few others I hadn’t even heard of before. I’m sure a lot of people leave a box on just for downloads, so this would eliminate the need for that.

On the right hand side are ‘Widgets’ – yep, just like the ones from Windows Vista and 7 that were killed off due to vulnerabilities. That doesn’t apply here though; these are configurable, and I decided to show the connected users, storage use, system health and finally the resource monitor that displays CPU/RAM/LAN usage.

There’s also a few other important areas a few clicks away, with the most important being ‘Storage Manager’:

[Screenshot: Storage Manager]

This is where you can create iSCSI LUNs and manage the physical hard drives inside the ioSafe. Creating a LUN was really easy, and they have the ability to thin provision. This means you can over-subscribe the storage – for example, you might have 2TB free like I do above, but you could create a LUN with 2TB of space, and another with 1TB. Each LUN only uses the space you actually write to, so you avoid having to guess and lock yourself in to certain LUN sizes early on. The only risk is that if you run out of physical disk space you’ll start to get issues, and you wouldn’t realise it just by looking at the LUN from a remote PC.

Personally I created a LUN that took up the whole 2TB available (1.79TB of real space) and then created another small 1GB LUN which I used as a Quorum for clustering.
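As a side note, here’s a minimal, hypothetical sketch of how a Windows 8 or Server 2012 machine could attach to one of these LUNs via PowerShell (the portal address is made up for illustration):

# Point at the NAS, then connect to the iSCSI target it presents.
New-IscsiTargetPortal -TargetPortalAddress "192.168.1.50"
Get-IscsiTarget | Connect-IscsiTarget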

Also, as a quick speed test, I copied the Windows Server 2012 R2 ISO (which weighs in at 3.97GB) from a local machine to the NAS via iSCSI, and it copied over in 33 seconds, averaging 115MB/s.

Copying the file back to the local host was much slower, which would be an indication of the local HDD’s single spindle, and came in at 45 seconds, averaging around 80MB/s.

The final area worth mentioning is Backup & Replication:

[Screenshot: Backup & Replication]

Again, there are a lot of options here. This removes the reliance on a remote device such as a PC to do backups, allowing the NAS to look after itself. You can back up contents from one area on the NAS to another, or plug in an external disk via the USB 3.0 ports and take it away for offsite backup requirements. There’s even Amazon S3 as a backup target – not something I’d use for large amounts of data, but it’s a nice addition.

So what is the end result of all this? It’s a NAS that is easy to set up and maintain thanks to Synology, wrapped up in great armour from ioSafe, without ridiculous pricing. This unit is ideal for a home user or small business that needs 4TB or less of data highly secured – and for an extra few hundred dollars versus a non-‘armoured’ NAS, it’s an easy decision.

Note: If you want the same features but need more drives, ioSafe also have an ioSafe 1513+ which has five HDD bays instead of two.

TechEd North America – Done for 2014

TechEd North America 2014 is now over. You can read about the first two days of my experience here. Unsurprisingly, the second half wasn’t too different from the first, and there wasn’t a huge amount of excitement in the air.

Wednesday morning started off slowly. There were a LOT of vendor parties on the Tuesday night beforehand, so maybe it was a difficult morning for many attendees. There wasn’t much to do once breakfast was over: there were breakout sessions to attend (where you go into a room and listen to a presentation – one of the biggest parts of TechEd), but the Expo hall (where all the vendor booths are) didn’t open until 11am.

I found it difficult to push myself to attend the breakout sessions, because they were all available the next day for free via Microsoft’s Channel 9 service. It’s a great idea from Microsoft, but many attendees I spoke to shared the same lack of enthusiasm for going to these, saying they could watch them online later.

There were some session highlights though. Anything with Mark Russinovich (creator of SysInternals) was highly talked about, and I attended “Case of the Unexplained: Troubleshooting with Mark Russinovich”, which was really interesting to watch.

I caught up with Nutanix to have a look at their offering over lunch. They treated me in style, giving me a Texas-style hat and using someone else’s leg power to get me there and back:

[Photo: the trip to lunch]

I learnt that Nutanix offer a well priced server-based solution that sits halfway between a single rackmount server and a full chassis/blade setup, and that also uses shared storage between the nodes (i.e. blade servers). I’ll definitely be looking into that further, both from a writing point of view and for my own place of work.

After that, I explored the Expo again, speaking to more vendors. Yes, there were a lot of goodies given away (generally called ‘loot’), but again, according to other attendees, there was a lot less than in previous years. I didn’t really try, yet I still came back with a suitcase full of novelties, which my work colleagues will hopefully go through and find some cool bits and pieces to make up for my absence.

Wednesday night came, and night time means more parties. I went to the Petri meet and greet where as the title suggests, I met and greeted another bunch of great people. After that the jet lag had gotten the better of me, so I went back to the hotel to order room service and pass out.

Thursday saw the final of Speaker Idol. It’s a competition run by Microsoft in the American Idol format (apparently?) where people perform 5 minute presentations until a winner is chosen, and that winner gets to present a full breakout session at next year’s TechEd. Aidan Finn ended up winning (and he wrote about the experience here); he was highly deserving of the achievement, but so were the other presenters I saw.

I had dinner with the friendly eNow mob who make reporting and monitoring tools for Exchange, Lync and others, as a +1 to someone who was actually invited.

The closing event was held at the local baseball stadium, Minute Maid Park.

[Photo: Minute Maid Park]

Not having been to an American stadium before, it was more of a novelty to me than to others. Jugglers, artists, many stadium-style food stalls and a mechanical bull surrounded the outskirts, while attendees took tours of the pitch itself and listened to the bands that played. Here’s the full list of everything that was available. Disappointingly, I wasn’t feeling 100% due to a cold, otherwise I would have sampled some of the nachos covered in American liquid cheese – something rarely seen in Australia.

Overall I’m really glad I went (I may not have been as positive on the very long plane ride home), as I met a bunch of great people – particularly Kyle Murley and Phoummala Schmitt, who both looked out for me, as well as Trevor Pott, who convinced me to go in the first place. I made lots of new contacts, and had the opportunity to say hi to tech greats like Mary Jo Foley.

TechEd North America – Half Way Mark

It’s now Wednesday 14th May, and we’re at the halfway mark of TechEd North America 2014. This is my first TechEd outside of Australia, and it’s been an interesting experience. A lot of the following reflections are shaped by my TechEd Australia exposure, which gave me certain expectations.

For starters, the community really is a great group. Almost everyone is very courteous and respectful, which is inviting and welcoming to someone who’s traveled here by themselves. It’s very easy to just start talking to someone, as everyone seems genuinely interested in finding out more about others and having a chat. For example, as I was sitting writing this, someone mentioned that I should eat something, as I hadn’t really eaten much of my food. We had a quick chat about jetlag, and I thanked him for his concern.

I’ve been told it’s a sold out event, with about 11,000 people in attendance, which dwarfs Australia’s 2,000-3,000 headcount. The venue itself, the Houston Convention Center, is huge, along with all the areas inside. The general dining area looks bigger than a soccer field to me.

The Expo area is about as big, and contains all the vendors giving away shirts, pens and strange plastic items while trying to convince you to learn more about their products. The staff are quite nice too, not being too pushy. There’s also a yo-yo professional, a magician and probably other novelties that I’ve not seen yet.

[Photo: the Expo area]

Many competitions are going on with the vendors too. One offered the chance to go bowling with Steve Wozniak, and I was standing next to him, which was awesome. Sadly I didn’t win the bowling part though:

[Photo: bowling with Steve Wozniak]

There’s a motorbike to win, countless Microsoft Surfaces, headphones and other bits and pieces that vendors are using to get the attendees to come visit.

Moving onto the keynote (which I liveblogged here), the focus was Mobile first, Cloud first. There wasn’t much noise from the crowd for the whole keynote, as most were probably coming to terms with having to start worrying about Azure now. Microsoft made it very clear that Azure was THE way now, not just an option.

The announcements in the keynote were all features for Azure. Good features, which others have written about in detail, but no new products or services. Not even a mention of the upcoming Surface 3 and Surface mini. No mention of Nokia either, but there was an iPad on stage to show off some Microsoft technologies. Times have changed!

There are hundreds of sessions going on every day, so we’re rather spoilt for choice. I’ve only been to a few, and am expecting to focus on them more today, but they’re a big part of what makes up TechEd, and so far they’ve been very informative. The 1 hour and 15 minute format means they don’t go on for too long, but don’t feel rushed.

Microsoft also decided to make one exam free to all attendees – the 70-409 Server Virtualization with Windows Server Hyper-V and System Center. I decided to take it and passed, which was a nice bonus.

Vendor parties make up all the non-TechEd time, and there are many going on at once – again, spoilt for choice. It’s another great way to meet others and find out what’s going on for other IT professionals, while sampling the local food and beverages.

There hasn’t been a huge buzz from attendees, but everyone is still happy to be here. It’s been a good two days so far, and I’m looking forward to the next two!