Opening up Outlook’s data format

In Q4 last year, Microsoft announced through its Interoperability @ Microsoft blog that it was planning to open up its proprietary PST email format used by Outlook.

The data in .pst files has been accessible through the Messaging API (MAPI) and the Outlook Object Model (two things of which my understanding is minimal at best), but only if the user has Outlook installed:

In order to facilitate interoperability and enable customers and vendors to access the data in .pst files on a variety of platforms, we will be releasing documentation for the .pst file format. This will allow developers to read, create, and interoperate with the data in .pst files in server and client scenarios using the programming language and platform of their choice. The technical documentation will detail how the data is stored, along with guidance for accessing that data from other software applications. It also will highlight the structure of the .pst file, provide details like how to navigate the folder hierarchy, and explain how to access the individual data objects and properties.

The documentation will be released under Microsoft’s Open Specification Promise, which means that it is protected against patent claims. Other Microsoft Office formats, such as the XML-based .docx and .xlsx, and the older binary formats .doc and .xls, are covered under this promise.
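To give a flavour of what "how the data is stored" means in practice, here's a minimal C++ sketch that identifies a .pst file and its format variant from the file's fixed header. The offsets and values are my reading of the (since-published) [MS-PST] documentation: the file begins with the magic bytes "!BDN", and a two-byte version field at offset 10 distinguishes older ANSI files (values 14 and 15) from Unicode ones (value 23). Treat this as illustrative and verify against the spec before relying on it:

```cpp
#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>

// Identify a .pst file and its format variant by inspecting the fixed
// header at the start of the file. Offsets and values are my reading of
// the [MS-PST] documentation; the format stores integers little-endian,
// and this sketch assumes a little-endian host.
int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::cerr << "usage: pstcheck <file.pst>\n";
        return 1;
    }

    std::ifstream in(argv[1], std::ios::binary);
    char magic[4] = {};
    in.read(magic, sizeof magic);
    if (!in || std::string(magic, sizeof magic) != "!BDN") {
        std::cerr << "not a .pst file\n";
        return 1;
    }

    in.seekg(10);  // wVer: format version field, per my reading of the spec
    std::uint16_t wVer = 0;
    in.read(reinterpret_cast<char*>(&wVer), sizeof wVer);

    // 14/15 = ANSI (older Outlook), 23 = Unicode (Outlook 2003 and later)
    std::cout << (wVer >= 23 ? "Unicode" : "ANSI")
              << " .pst (wVer = " << wVer << ")\n";
    return 0;
}
```

The documentation builds up from raw structures like this header all the way to the folder hierarchy and individual message properties.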

This seems like a big win for users of Microsoft Outlook. Along with CodePlex, which hosts open source projects, it seems like Microsoft is slowly opening things up and making life easier for its customers. It certainly has the potential to make it easier for them to leave the Outlook platform. From GigaOM:

In the past, if someone was moving from Outlook/Exchange to Gmail or any other platform, there was a pretty tedious process of exporting pieces of data from Outlook into various formats before moving over to the new platform. Basically, once you didn’t have Outlook, that .pst was a useless brick of data. Now in that case you’ll be able to take that .pst file with you and if other apps/platforms build readers, they will be able access that data. So migration to other platforms is a valid use case where there’s some benefit.

More ideas about why Microsoft is making this change were floated on ZDNet a day after the announcement:

[Rob Helm, an analyst with Directions on Microsoft,] added that he believed Microsoft is trying to wean large customers from storing mail in .PST files or file systems “because doing that makes it hard for organizations to back up all their e-mail, enforce e-mail retention policies, and locate relevant e-mails during legal discovery.”

Not just retention, but perhaps also helping organizations mine their email data for knowledge that is all too frequently lost forever when an employee leaves the company? Here's an idea: how about a tool that gathers information from emails dating back years and automatically populates a wiki for new employees?

[Rob Sanfilippo, another Directions on Microsoft analyst] added that .PSTs “are used most frequently for archiving purposes and Exchange Server 2010 includes a new server-based Personal Archive feature that gives users a separate mailbox to use for archiving on the server instead of using a PST.” He said this gives weight to the aforementioned idea that Microsoft is trying to help organizations get users off PSTs and onto server storage.

Then, in February of this year, the promised documentation was released on the MSDN website. Finally, about a month ago, two open source tools that make use of the documentation were released on CodePlex:

  • The PST Data Structure View Tool is a graphical tool that lets developers browse the internal data structures of a PST file. The primary goal of this tool is to assist people who are learning the .pst format and to help them better understand the documentation.
  • The PST File Format SDK is a cross-platform C++ library for reading .pst files that can be incorporated into solutions that run on top of the .pst file format. The capability to write data to .pst files is on the roadmap and will be added to the SDK in a future release. (See the sketch below this list for a taste of the reading side.)
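For a sense of what incorporating the SDK might look like, here's a short sketch that dumps the subject of every message in a file. I haven't compiled this against the library, and the header name, the pstsdk::pst class, and its message iterator and get_subject() accessor are my assumptions from skimming the project's samples, so treat it purely as illustration:

```cpp
#include <exception>
#include <iostream>
#include "pst.h"  // PST File Format SDK umbrella header (name assumed)

// Dump the subject line of every message in a .pst file. The class and
// method names below are assumptions based on the SDK's samples.
int main() {
    try {
        pstsdk::pst store(L"archive.pst");  // hypothetical file name

        // Assumed: the SDK exposes a flat iterator over every message
        // in the store, regardless of which folder it lives in.
        for (auto it = store.message_begin(); it != store.message_end(); ++it) {
            std::wcout << it->get_subject() << L"\n";
        }
    } catch (const std::exception& e) {
        std::cerr << "failed to read .pst: " << e.what() << "\n";
        return 1;
    }
    return 0;
}
```

Even a loop this small hints at the migration scenario GigaOM describes above: once any platform can iterate over a .pst, the "useless brick of data" problem goes away.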

The project has seen some exciting progress, which is good news for organizations that use Outlook. And as you might know, data visualization used to enhance understanding is a favourite topic of mine!

What risk do these developments address within Outlook’d organizations? Knowledge/information management is critical to so many companies, and this openness will enhance the use, retention and (hopefully) reuse of the knowledge employees develop and store in email conversations within Outlook.

Has your organization taken these developments into account in your audits of knowledge/information management and strategy?

E&Y: Internal Audit should drive strategy

BusinessDay, a South African business news website, recently published an article referencing an E&Y study involving “more than 100 industry analysts from more than 20 disciplines”:

Organisations need to break out of the compliance cocoon and evolve into a fully fledged leadership role that delivers real value to the business. In the current economic climate, the biggest risk for most companies is not a failure to meet compliance requirements, but a failure to meet strategic targets.

The study also assessed last year’s top 10 business risks. In it, the analysts ranked the aftershocks of the credit crunch and the deepening global recession as the most important business risks, displacing regulation and compliance from the top spot.

Still more evidence that the Internal Audit profession demands an expanding skill set and well-rounded people with experience in more varied aspects of business. Auditors will have to keep pushing themselves outside their comfort zones in order to provide the greater value that shareholders require of the function.

How does your IA department stack up?

Survey says: IA feeling the squeeze

A survey conducted by Protiviti at the recent Institute of Internal Auditors annual conference has revealed that two-thirds of IA professionals believe their department is under-resourced and therefore unable to adequately carry out its duties.

Protiviti’s take is that auditors feel under-resourced because of increased expectations of the assurance Internal Audit can provide on an ever-widening spectrum of enterprise risks. Sukhdev Bal, a director at Protiviti, says:

This survey is a clear indication that internal auditors themselves believe that prior to the recession, they were not fit for purpose in terms of focus, skills and capabilities. Audit committees, Internal Audit leaders and management need to work more closely and collectively to agree the role of audit, objectives, criteria for audit and the overall approach of the internal audit function required to meet current and future evolving needs. Importantly, having agreed these, they need to ensure that the function is staffed with the right skills, capabilities and experience to meet these objectives.

There is evidence that spending on governance, risk and compliance didn’t decrease in 2009 compared to 2008, so I think Protiviti’s assessment is correct. IA is being asked to expand its risk coverage beyond traditional areas of expertise, and it’s only natural to feel a little overwhelmed by the expectations. In my opinion (and experience), the key to adapting will be support for training in non-traditional areas.

The survey is available on Protiviti’s website (if you give them some personal information first).

Payroll system conversion horror story

The Fort Worth (Texas) school district’s payroll system conversion has resulted in serious errors, to the tune of more than $1.5 million.

The school district overpaid employees and former employees at least $1.54 million, according to the [internal] audit. It also found that the district’s payroll system lacked proper controls, was cumbersome and inconsistent, and included manual paper entries that led to human error.

Aside from the poor conversion, it doesn’t sound like the new system is all that great if it requires manual entries. I’m assuming the entries are needed because the payroll system doesn’t interface with the district’s general ledger system. In that case, additional review controls over the process between the two systems are required.
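To make that concrete, here's a toy sketch of the kind of automated reconciliation control I have in mind: total the payroll register and the corresponding general ledger postings period by period, and flag any variance for review. The CSV file names and the "period,amount" layout are invented for illustration:

```cpp
#include <cmath>
#include <fstream>
#include <iostream>
#include <map>
#include <sstream>
#include <string>

// Toy reconciliation control: compare payroll register totals to the
// amounts posted in the general ledger, period by period, and flag
// variances for manual review. The "period,amount" CSV layout is
// invented for illustration, and clean numeric input is assumed.
std::map<std::string, double> totals_by_period(const std::string& path) {
    std::map<std::string, double> totals;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream row(line);
        std::string period, amount;
        if (std::getline(row, period, ',') && std::getline(row, amount)) {
            totals[period] += std::stod(amount);
        }
    }
    return totals;
}

int main() {
    auto payroll = totals_by_period("payroll_register.csv");    // hypothetical
    auto ledger  = totals_by_period("gl_payroll_postings.csv"); // hypothetical

    for (const auto& [period, paid] : payroll) {
        double posted = ledger.count(period) ? ledger[period] : 0.0;
        if (std::fabs(paid - posted) > 0.01) {  // flag variances over one cent
            std::cout << period << ": payroll " << paid
                      << " vs GL " << posted << "  <-- REVIEW\n";
        }
    }
    return 0;
}
```

In a real environment the thresholds, file formats and follow-up workflow would come from the finance team, but even a check this simple would surface a seven-figure variance long before an audit does.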

Some trustees are seeking an independent audit of the problems to get more assurance that fraud wasn’t a factor and that all the issues have been resolved.

[Trustee Christene] Moss said she wasn’t comfortable with parts of the report in which the [internal] auditors could not determine why various issues happened.

Yeah, I’d be concerned about that too! As well, the auditors aren’t certain that all the overpayments have been identified and fixed. I think these are the main reasons why an independent audit is needed. The situation calls for a specific engagement looking at the system conversion process and subsequent issues.

Board President Ray Dickerson reiterated that he didn’t think there was a need for a costly external audit. He said controls will be put in place.

[…]

Dickerson said the problems that were found are typical in such a transition.

“No matter how well you plan and train, once you flip that switch, you’re going to find things you didn’t know,” he said.

Uh, not really, dude! And certainly not $1.5 million worth of “things you didn’t know” (on an average monthly payroll of $41 million)!

As a not inconsequential footnote, the conversion to a new system was required because the old system’s vendor was ending support for it. A quick search for “open source payroll software” turns up many options that could prevent vendor lock-in in the future.

Update: Another story, this one in the Fort Worth Weekly, has more details about the internal audit’s findings and the attempts by the district to have some former employees repay the erroneous amounts.

Survey says: ERM implementations maturing

A survey conducted in July and August of 2009 by Aon has revealed that companies are moving beyond “basic” ERM implementations:

62% of the survey respondents in the Global Enterprise Risk Management Survey 2010 reported going beyond basic ERM, compared with only 38% in Aon’s inaugural ERM survey in 2007.

I wonder what happened between 2007 and now that would’ve affected companies’ willingness to ramp up their risk management practices…

The survey asked respondents (of which there were 201) to rate the maturity of their ERM implementation, from “initial/lacking” through “basic”, “defined”, “operational” and “advanced”.

My take is that respondents are likely to overestimate the maturity of their implementations, and that companies are more likely to respond to the survey at all the more advanced they (feel they) are in the process. Still, the survey is a welcome indicator that ERM efforts are on the rise.

I also think that ratings firms taking ERM into account when determining their grades helps executives point to a tangible financial benefit and obtain buy-in from all stakeholders, which is critical. In my mind, the primary indicator of maturity in a company’s risk management program is how comprehensive it is across all departments and divisions, since the “initial/lacking” stage is marked by a rigid, siloed approach.

The survey is available on Aon’s website (if you give them some personal information first).