Fremantle Ports gets analytics into ship shape


By GovTechReview Staff
Wednesday, 17 April, 2013


Every environment has its own pressures, but the logistics necessary to co-ordinate the annual movement of $25.9 billion worth of seaborne goods are a special kind of challenge for Fremantle Ports.

In managing 80 percent of all imports to Western Australia, Fremantle Ports – a WA government trading enterprise – coordinates a steady flow of massive container ships that supply Perth and its surrounds with a broad range of imported products. It is Australia’s fourth-largest container port and a major conduit for livestock and raw-materials exports.

With over $70m worth of stock passing through the port on any given day, tracking what can be dozens of corresponding ship movements is a major exercise. For many years, that challenge had been met by staff using Microsoft Excel spreadsheets – more than 150 of them, according to business systems consultant Collins Vuchocho. As often happens, heavy reliance on spreadsheets had its limitations: data was landlocked, inconsistent and used differently from one person to the next. Reporting was difficult and invariably retrospective – making it all but useless in helping logistics specialists plan the movement of goods. While an earlier implementation of IBM’s Cognos 7 business intelligence (BI) system had generated some interest and was used by a few employees, most staff had rejected the platform and still relied on familiar spreadsheets for their daily work.

Data mining, and the other kind

Although the port was functioning, inconsistencies in master data management and reporting left managers frustrated at their lack of visibility into day-to-day operations. Getting details of a ship’s manifest, for example, might involve a lodgement with the ship’s agent and a two-day delay – by which time the ship had docked and Fremantle Ports had missed the opportunity to provide appropriate support, such as the provision of a hazardous materials response team for certain cargoes.

Lack of real-time information was not only preventing Fremantle Ports from supporting incoming ships properly; it was also threatening the port’s ability to keep up with the demands of Western Australia’s increasingly mining-driven economy. Real-time data was crucial to coordinate the movement of mining products like iron ore, which arrived by train and had to be quickly loaded onto ships for overseas transport.

With mining trains often measuring several kilometres long, port staff had to work to tight windows to bring those trains in, unload them, and get them out in time for the next arrival. Missing that window could strand the train on the port’s rail lines for days – creating a significant and expensive knock-on effect across the rail network.

“The challenge of servicing our customers became a lot more, because a failure meant we wouldn’t meet our service level agreements,” says Vuchocho. “We needed a platform that could analyse our data quickly, and let managers analyse it in real time. This had put a lot of strain on our previous systems, and our needs had changed from reporting once a week or once a month; managers wanted to know what was happening in real time, all the time.”

Old analytics, new attitudes

As the Western Australian economy converged around the mining industry in the wake of the global financial crisis, it became clear that Fremantle Ports needed to modernise its administrative systems. When it invested in a new ports management system, it also began considering options for an improved BI environment.

“Over the years it had gotten to the point where we couldn’t keep it going,” Vuchocho explains. “We were looking at moving to the next stage from a technological standpoint, but because the new port management system was such a high-pressure project, the requirements kept changing. By the time we actually got to looking at our back end, it became clear that we had to look at our data warehouse or we were going to continue on with the mistakes we were making.”

Although an upgrade from Cognos 7 to the newer Cognos 10 might have seemed the obvious choice, staff had developed such antipathy towards the previous environment that there was strong internal opposition to the move. Many decried the “absolutely horrendous” security of Cognos 7, while others had built OLAP (Online Analytical Processing) data ‘cubes’ that had fallen into disuse, drifting back to spreadsheets over time.

 "

By the time we actually got to looking at our back end, it became clear that we had to look at our data warehouse or we were going to continue on with the mistakes we were making.

The IT team began considering other options, but kept returning to the newer Cognos platform after pilot testing showed it offered broad flexibility in data manipulation and reporting. Managers grudgingly began to warm to the new dashboard-based environment. “For the first time we had the ability to see the current data we had, but to have very different views of the data,” Vuchocho explains. “That gave us the functionality to look at all the data that was coming in, and to create quite different relationships between that data. Because of that, we were able to very quickly build new cubes and provide the business with the data they wanted. That was key for us.”
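For readers unfamiliar with the ‘cubes’ Vuchocho describes, the sketch below shows the general idea in Python: aggregating records along several dimensions at once, so the same data can then be sliced by any combination of them. All data, column names and figures here are invented for illustration – neither Fremantle Ports’ schema nor its Cognos models are public.

```python
# A minimal sketch of the OLAP-cube idea: aggregate once along several
# dimensions, then slice the result any way the analyst likes.
# All data and column names are hypothetical, for illustration only.
import pandas as pd

movements = pd.DataFrame({
    "vessel_type":  ["container", "container", "bulk", "bulk", "livestock"],
    "berth":        ["North Quay", "North Quay", "Kwinana", "Kwinana", "Victoria Quay"],
    "month":        ["2012-03", "2012-04", "2012-03", "2012-04", "2012-04"],
    "cargo_tonnes": [42_000, 39_500, 180_000, 175_000, 8_200],
})

# Build a "cube": total tonnage for every combination of the dimensions.
# margins=True adds the subtotal ("All") slices, like an OLAP rollup.
cube = movements.pivot_table(
    values="cargo_tonnes",
    index=["vessel_type", "berth"],
    columns="month",
    aggfunc="sum",
    margins=True,
    fill_value=0,
)
print(cube)
```

The point of pre-aggregating this way is that each new “view” of the data is a lookup into the cube rather than a fresh pass over the raw records – which is what let the team answer ad hoc questions quickly.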

A whole new world

Pushing the implementation into the business was an entirely different issue, however. To bypass the hostility towards the Cognos platform, the IT team decided the best approach to the migration was to simply start over, as if Cognos 7 had never existed. “We discovered we were so far behind that we just couldn’t do a straight migration,” Vuchocho recalls. Over the course of eight months, the organisation’s data warehouse was designed and built from the ground up – on Microsoft Windows Server 2008 and SQL Server 2008 – and the team carefully vetted the 77 existing Cognos management reports. That number was cut to around 30 after eliminating redundant functionality and optimising processes. This simplicity, combined with the more user-friendly interface and ad hoc analytics of the new platform, helped win over users who were still wedded to their Excel spreadsheets.

Weaning users off their spreadsheets wasn’t always easy: “Some users were very used to their data and still wanted their spreadsheets,” Vuchocho says, “because they had been created quickly and suited their needs at the time. We still made the old Cognos reports available as we were undertaking the migration – but as the new environment got into the business, users realised it made it easier for them to pull their data.”

Extensive and ongoing user training, run with the support of IBM, put more wins on the board as the new Cognos platform delivered the real-time visibility employees had been demanding. Cognos proved particularly popular because it could provide monthly baseline figures against which managers could chart day-to-day trends – giving them new views of the business. The initial base of 30 Cognos reports has swelled to more than 80 over time, as the new data warehouse and Cognos analytics platform have established themselves as the de facto management platform. And this time, Vuchocho says, users are still aboard.
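As a rough illustration of that baseline-versus-trend reporting, the hypothetical Python sketch below compares each day’s figure against its month’s average. The data and names are invented; this is not Fremantle Ports’ actual report logic.

```python
# A hypothetical sketch of "monthly baseline" reporting: compare each day's
# figure against that month's average to surface day-to-day trends.
# Data and column names are invented, for illustration only.
import pandas as pd

daily = pd.DataFrame({
    "date": pd.date_range("2012-04-01", periods=6, freq="D"),
    "ship_movements": [11, 14, 9, 16, 12, 18],
})

daily["month"] = daily["date"].dt.to_period("M")
# The baseline: the mean movements for the month each day belongs to.
daily["monthly_baseline"] = daily.groupby("month")["ship_movements"].transform("mean")
# The trend signal: how far each day sits above or below its baseline.
daily["vs_baseline"] = daily["ship_movements"] - daily["monthly_baseline"]
print(daily[["date", "ship_movements", "monthly_baseline", "vs_baseline"]])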

“This system has given us the ability to look at all this information on these different systems, and have it in one place where it’s easy to manage, analyse and share,” says Vuchocho. “We’re now exploring new functionality and looking at new activity reports to see how we can leverage that with the business. People have so much freedom, and have discovered they can drill down to more detail – and they’re now asking for a little bit more.” – David Braue 

This case study originally ran in the April/May 2012 issue of GTR.
