
In the press: Data dieting for dummies

/ Technical
April 27th, 2012


You know you’ve got a problem when people start denying it, writes Nick Booth. “You’re not a fat company,” your so-called analyst friends will say, “you’ve just got big data.”

Big data. It’s one of those weasel words they use to avoid embarrassment. When you’re out of earshot, they’ll have no such sensitivities. “Did you see the state of his NAS?” an unscrupulous salesman will say, after he’s left you some leaflets about upgrading to a bigger size.

It happens to us all. One day, you catch a glimpse of some vast infrastructure and ask yourself, “Whose is that huge array?” Then you realise it’s your own data that’s being mirrored.

It’s a massive problem: the information lifeblood of the average British company is becoming increasingly sclerotic. Circulation is slow, and bandwidth pressure is sky-high. It’s all those fat video files and JPEGs – they’re loaded with megabytes. The body of the SME cannot use these, so the brain tells the network to store them in the subcutaneous layer of the infrastructure, on vital organs such as the hard drive and on the extremities (AKA the datacentre).

Your mind is tricked into thinking this information is useful. “Don’t throw that talking head video away,” a marketing buffoon will say, “We could analyse this for trends later.”

So you start hoarding all kinds of crap which you kid yourself is useful: ‘Facebook Likes’ from people you’ve never heard of get stored, along with all kinds of other social media crud from start-ups with less shelf life than a past-its-sell-by-date avocado, but whose flabby bits will have a much longer legacy.

There’s a 90:90 rule for data, if we understand Robert Rutherford, MD of QuoStar Solutions, correctly. “90% of data is written once and never read again,” he says. Meanwhile, any data that hasn’t been looked at for 90 days should, at the very least, be archived.
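For the curious, here’s a rough sketch of what that 90-day test might look like in practice – a hypothetical Python script that walks a file share and flags anything whose last access time is more than 90 days old. The path is made up for illustration, and access times are only meaningful if the filesystem isn’t mounted with noatime:

```python
# Rough sketch of the 90-day rule of thumb: flag files that haven't been
# read in 90+ days as archive candidates. The share path below is a
# made-up example, and atime is only reliable on filesystems that record it.
import os
import time

ARCHIVE_AGE_SECONDS = 90 * 24 * 60 * 60  # 90 days

def archive_candidates(root):
    """Yield (path, days_since_last_access) for files not read in 90+ days."""
    now = time.time()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                last_access = os.stat(path).st_atime
            except OSError:
                continue  # skip files we can't stat
            idle = now - last_access
            if idle >= ARCHIVE_AGE_SECONDS:
                yield path, int(idle // 86400)

if __name__ == "__main__":
    # Example: scan a hypothetical file share for cold data.
    for path, idle_days in archive_candidates("/srv/fileshare"):
        print(f"{idle_days:4d} days idle: {path}")
```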

Read the full article on MicroScope