Clearing the fog from the cloud: A GTR roundtable (part 2)
This is part 2 of a GTR roundtable on cloud computing that brings together experts from several different parts of the industry. Part 1 ran yesterday.
GTR: How much of an issue are concerns about data sovereignty?
HANRAHAN: It’s definitely something that has affected the decision-making; if you can’t keep data in the country, with government clients the cloud discussion almost immediately comes to a stop.
We made the fairly significant decision to deploy a public cloud infrastructure into each region we deal in, and to strike our contracts in the region we’re signing up in. That means you can control whether your data leave the country or not – and that any dispute resolution or local laws can be catered for in the contract.
It has taken significant investment on our side, but we believe the data sovereignty issue is such a high priority that there’s no other way to address it. And it’s starting to pay off: it becomes almost a fundamental entry point for discussions. The fact we can say ‘yes, it stays in country’ is a positive move forward.
BIVIANO: The obstacles are still the unknowns: there is a degree of discomfort amongst some individuals about moving data into public clouds, and this is really about the provider being able to offer guarantees and assurance about security and compliance levels.
GTR: Governments have had mixed (and generally unsatisfactory) results with outsourcing and shared services in the past. How can they make cloud work where previous approaches failed?
DAN: The notion of utility, on-tap computing, that can be turned on and off as required, with the control residing back with the organisation, is a significant departure from the traditional models. So, too, is the notion of being able to continually benefit from advances in technology, capability and computing power, without being locked into an infrastructure or environment that may age for the duration of a traditional outsourcing or shared services agreement. These are strong differentiators for cloud computing, and we already see indications that modern, next-generation outsourcing arrangements are incorporating provisions and capabilities to take advantage of the benefits of utility computing.
BIVIANO: If you’re going to have a SaaS provider for a part of your business, you’re going to have a lot less control than a cloud provider might be able to offer. But you can still look at your centralised security policy: you might mandate two-factor authentication for them, or that only certain levels of information are stored in that service.
GTR: SaaS, IaaS and PaaS offer different value to different agencies. Which offers the best entry point to the cloud, and what risks do they involve?
BIVIANO: From a security perspective, none is inherently harder to secure than the others. It’s about control – but you’ve got to ask yourself whether you want that control, because with the power to control comes the responsibility to do it properly.
For example, SaaS offers a lot of agility and is often used as a quick way of solving problems. But if you’re an organisation storing data with any level of sensitivity, you need to drill down into the SaaS provider’s SLAs and assurance documents to ensure that what’s going to happen with that data is in line with your requirements.
DAN: Cloud computing is here to help address the requirements of the customers, so it is really these requirements that dictate the most suitable entry point to the cloud. For some organisations it may be the need to have a particular customer relationship management system, and it’s very easy to try a SaaS offering and get one. Similarly, if an organisation needs a bursting or testing or disaster recovery capability, they can very easily buy some IaaS and use it for the duration of the project, then hand it back. So, the point of entry varies according to the needs of the organisation.
If we’re trying to take a look at what catalysts appear to have a contribution to organisations adopting cloud services, we have recently worked with Ovum, who produced a report that puts forward a Cloud Services Catalysts Framework for government organisations looking at embracing cloud. That framework differentiates different catalyst profiles of organisations embracing IaaS, PaaS and SaaS, and looks comprehensively at business needs, leadership decisions and digital culture or “internet-age thinking”.
HANRAHAN: What’s not clear to most government organisations is how much impact paying for services on consumption will have versus their traditional, and still very valid, capex budgeting process. There is hesitancy as they ask ‘if I go to a full consumption model and end up consuming more, how do I manage that?’
But some of the smaller departments are seeing that they can save money on consumption, and that’s where we’ll see more and more acceptance of cloud in the next six months: people picking up an individual set of workloads, and saying ‘let’s see how this plays out’.
GTR: The promise of pre-rolled infrastructure is particularly appealing for resource-constrained local governments. Are they particularly well suited to tapping into cloud models?
HANRAHAN: We’re seeing a lot of proofs of concept rolled out across state government, which is driven by the acceptance that it’s just another delivery model – but one that can change the cost position for those smaller agencies. A lot of those agencies don’t have the ability to continue to invest in infrastructure around new projects and new models because of their budget constraints.
However, local government often doesn’t want to be first; they perceive a higher risk in leading, and quite often they follow rather than lead. But they’re in the same position as some of the smaller state agencies: they have very limited budget, continuing demand and limited services. And if they can deliver those on a consumption basis rather than a capex model, it will become an important part of their strategy.
DAN: I recently spoke at a local government conference and I was amazed at the things they are achieving on the back of very lean and frugal operations. In addition to lack of suitable funding for IT operations, especially in rural and regional areas, they lack suitable skills and lack the infrastructure that we take for granted in cities.
Personally, I think the provision of IT as a service will help address these challenges. In terms of budgets, the removal of costly capital expenditure may help remove lumps in the budget that may not be easy to absorb. Similarly, the provision of services “from the cloud” may reduce the need for specialised skills.
In addition, as cloud computing offers are being adopted, we are likely to see a drive towards standardisation across multiple users – thus allowing local governments to share their skills and experiences and collaborate more, thereby contributing to a body of best practices and increasing the benefits they derive from their investments.
So, in conclusion, I think that cloud computing will really allow governments to provide better services that meet the expectations of their communities, and ultimately connect better with those communities.
BIVIANO: Each tier of government has its own challenges when it comes to budgets, and this is one of the areas where cloud does help them a lot. They’re able to realise the cost savings they require at each level of government.
However, it’s not that the cloud is a one-stop answer for their problems. They may have challenges associated with how their IT is dispersed: for example, they may have small, centralised business units or public-access areas like libraries. We’re already seeing specialist cloud providers popping up for local governments, and those niche providers are able to solve those problems for local councils.
GTR: How important are telecommunications upgrades for the cloud to reach its e-government potential?
DAN: Telecoms providers have an inherent advantage in that they are able to make that connection between the customer and the cloud computing offering.
That connection is very important for the customers: when we start talking about reliability of clouds, we’re not only talking about the reliability of the cloud-computing platform, but about how important it is to be able to get to that cloud consistently, promptly, reliably and securely.
From that point of view, having the opportunity to tailor a particular connection between a customer and their service becomes quite important. Also, from the point of view of providing an end-to-end security layer from the customer to their cloud service – which might otherwise be tampered with or interfered with – telco players are advantaged against other, more discrete cloud computing offers.
HANRAHAN: The cost model around telecoms links has obviously changed. We’re seeing quite a few requests around departments looking for services that could be accessed via AARNet in preference to the Internet. So, clearly, there are still some drivers to ensure that data access is the most cost-effective means, but it’s no longer the factor that it once was.
GTR: Cloud standards bodies are working diligently to nut out standards for data portability, federation of services, and so on. How much has yet to be done to make these standards government-ready?
HANRAHAN: There is a long way to go here. The challenge, as in any area where large industry players come together to drive a set of standards, is that it can be a slow process and often comes out with the lowest common denominator. In the short term, the expectation is that anything in the cloud can be accessed through an open API – but we’re not seeing a lot of demand from clients for our API to be standardised or consistent with another service provider.
There seems to be an expectation that they can build workflow across multiple cloud service providers, and manage risk that way. Over time we’ll see more standards, particularly around how data is encrypted, and moving data between clouds. But at the moment, getting a service that is fit for purpose, and provided the right way, and can be secured and managed, is higher priority than interoperability between clouds.
DAN: For a few years now, there have been many efforts to establish standards in cloud computing by governments, industry groups, vendor alliances and other communities interested in this space. When I last looked, there were at least a dozen such groups all working together to establish best practice, allow for best platforms, and enable the transition across multiple vendors and platforms.
For all this to evolve into standards, those efforts will need to coalesce around a set of best practices, taxonomies, and so on. It’s in the interest of everyone for such standards to be developed ASAP.
Telstra is involved in a number of such groups, with more active participation in some and an advisory role in others. We keep a close eye on the vendor community that we work with, and also keep close interaction with the government and their representation on those bodies. Because we have the perspective of an end user as well as a provider, we bring a quite balanced and comprehensive view to these things.
BIVIANO: As the industry matures, there will be more standards appearing. I don’t necessarily think you need a lot more high-level standards. You may have some more technical standards appearing, but a number of standards are already adapting themselves to life in the cloud.
PCI, for example: you can still have a cloud policy and have PCI compliance. But you need to be sure that your cloud providers are able to help you keep up with those requirements. – David Braue