This week we are diving into the world of data science. Our specialist data science recruiter, Becky, takes a look at some of the top headlines. Here is the rundown of the most exciting data science news.

Network Rail Leveraging Data Science to Speed Up Commutes

Network Rail’s newly appointed CIO, Aidan Hancock, is on a mission to transform commuting by train through data science. Indeed, he has vast amounts of data to rely on – “10TB of data is captured for every 400 mile stretch, every fortnight.”

Network Rail’s budget for this project is a whopping £969m between 2019 and 2024.

Hancock estimates that Network Rail currently utilises roughly 10% of the data they have available. He wants to kickstart his transformation by boosting several projects with AI and machine learning.

Where does their data come from?

Not only does Network Rail manage the infrastructure of the UK’s railways, but it also operates mobile field bases. These field bases collect data to keep day-to-day operations running smoothly. Leveraging these datasets will help to “fully automate timetabling systems. As well as helping cut down on those dreaded points failures which scupper many a commute.”

Hancock also wants to scale up the Intelligent Infrastructure programme, which currently runs only in selected locations across the UK, and plans to invest £200m in it. The project focuses on improving asset management across Network Rail through improved collection, integration, interrogation and systemisation of its available data, as well as through a more integrated approach.

The digital transformation

In his digital transformation, Hancock wants to employ Microsoft platforms, such as Azure, and new hybrid cloud solutions. At the same time, he wants to capitalise on existing systems and solutions, such as Apple mobile technology. However, it should be noted that Network Rail “hasn’t quite completed the move to Office 365”, and Windows 10 is also yet to be rolled out across the organisation.

It is also important to Hancock to develop more agile and innovative ways of solving problems, as well as to employ the right data science talent. As he told Computing:

“With data science, we’re not going to transform overnight. On one end, you’ve got serious experts who live and breathe Python and R and so on, and they really love mountains of data, and I want them working here (…) But all the way at the other end, I’d like to get to a point where almost everybody, if you’re sitting at a desk for your job, should be able to do some data science.”

It’s a truly inclusive approach to digital transformation – we look forward to seeing the results that Aidan Hancock achieves!

DigitalOcean Launch MySQL and Redis Database Services

After introducing their Managed Databases for PostgreSQL earlier this year, DigitalOcean are releasing their Managed Databases for MySQL and Redis!

DigitalOcean are determined to provide fully customisable and scalable solutions that address the most common problems developers face. These include determining the optimal infrastructure, implementing reliable failover processes, and setting up a complete and reliable backup and recovery process.

It is important to highlight that DigitalOcean created managed databases with simple interfaces. This allows developers at all levels to work on them. As they said, all you need to do is “select the database engine, storage, vCPU, memory, and standby nodes and we take care of the rest.”
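To make that concrete, here is a minimal Python sketch of creating a cluster through DigitalOcean’s public v2 API. The endpoint and field names below reflect the documented `/v2/databases` API, but exact values (region slugs, size slugs) are assumptions and should be checked against DigitalOcean’s current documentation.

```python
import json
import os
import urllib.request

API_URL = "https://api.digitalocean.com/v2/databases"

def build_cluster_spec(name, engine="mysql", region="nyc1",
                       size="db-s-1vcpu-1gb", num_nodes=2):
    """Build the JSON payload for creating a managed database cluster.

    Setting num_nodes above 1 adds standby nodes for automatic failover.
    """
    return {
        "name": name,
        "engine": engine,        # "pg", "mysql" or "redis"
        "region": region,        # datacentre region slug
        "size": size,            # slug bundling vCPU, memory and storage
        "num_nodes": num_nodes,  # primary plus standby nodes
    }

def create_cluster(spec, token):
    """POST the cluster spec to the DigitalOcean API."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(spec).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

spec = build_cluster_spec("example-mysql-cluster")
print(spec["engine"], spec["num_nodes"])  # mysql 2

token = os.environ.get("DIGITALOCEAN_TOKEN")
if token:  # only hit the API when a token is configured
    create_cluster(spec, token)
```

Everything beyond the five fields in the spec — replication, backups, failover — is handled by the service, which is exactly the point DigitalOcean are making.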

As pointed out by TechCrunch,

“This move exemplifies DigitalOcean’s ambition to move beyond its discount hosting roots to become a more fully fledged cloud provider.”

Shiven Ramji, Senior VP of Product, points out:

“With the additions of MySQL and Redis, DigitalOcean now supports three of the most requested database offerings, making it easier for developers to build and run applications, rather than spending time on complex management. The developer is not just the DNA of DigitalOcean, but the reason for much of the company’s success. We must continue to build on this success and support developers with the services they need most on their journey towards simple app development.”

Cerebras Systems Unveil the Largest Semiconductor Chip Ever Built

A Silicon Valley startup, Cerebras Systems, presented their Cerebras Wafer Scale Engine. The chip has 1.2 trillion transistors and is “57 times bigger than Nvidia’s largest general processing unit, or GPU. It boasts 3,000 times the on-chip memory.” GPUs are the silicon workhorses that power many of today’s AI applications, crunching the data needed to train AI models.

The chip is groundbreaking. According to Cerebras, the traditional way of hooking up small chips together slows down the training of AI models. The Wafer Scale Engine, on the other hand, possesses 400,000 cores that, linked together, speed up the data processing. Plus, it is able to shift data between processing and memory really fast.

Will this invention truly revolutionise AI research? We’re excited to see!

Simon Data raises $30m in Series C

Simon Data, a customer data platform, announced a $30 million Series C round of funding. The platform collects data from a variety of marketing tools to build a better understanding of customer data and activity. It then delivers messages to customers and triggers follow-up actions.

As Jason Davis, the CEO, told TechCrunch:

“It’s about taking the data, and then building complex triggers that target the right customer at the right time. This can be in the context of any sort of customer transaction, or any sort of interaction with the business.”

Companies are also able to view the aggregated data to inform their marketing decisions.
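The kind of trigger Davis describes can be sketched in a few lines of Python. This is purely illustrative — the event fields, thresholds and rule below are hypothetical, not Simon Data’s actual API — but it shows the shape of “target the right customer at the right time” logic:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical unified event record collected from marketing tools.
@dataclass
class CustomerEvent:
    customer_id: str
    event_type: str        # e.g. "cart_updated", "email_opened"
    cart_value: float
    occurred_at: datetime

def abandoned_cart_trigger(event, now,
                           idle=timedelta(hours=2), min_value=50.0):
    """Fire a win-back message when a valuable cart sits idle too long."""
    return (
        event.event_type == "cart_updated"
        and event.cart_value >= min_value
        and now - event.occurred_at >= idle
    )

now = datetime(2019, 9, 1, 12, 0)
event = CustomerEvent("c-42", "cart_updated", 80.0,
                      occurred_at=datetime(2019, 9, 1, 9, 30))
print(abandoned_cart_trigger(event, now))  # True
```

A production platform composes many such rules over live event streams; the machine learning investment mentioned below would, in effect, learn the thresholds rather than hard-coding them.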

One way in which Simon Data intend to use their newly raised capital is by employing machine learning to devise more refined and complex interactions:

“There are a tremendous number of super complex problems we have to solve. Those include core platform or infrastructure, and we also have a tremendous opportunity in front of us on the predictive and data science side as well.”

The Smart Efficient Energy Centre secures £4.6m funding

Smart Efficient Energy Centre (SEEC) is a Welsh data science hub set up through Bangor University. With their newly secured funding, SEEC “will investigate the options for using big data science to improve the efficiency of low carbon energy systems including nuclear, marine and offshore wind energy.”

The organisation has an ambitious vision to become an international hub of excellence in North Wales. As Professor John Healey, Director of Research at the College of Environmental Sciences and Engineering said: “It will lead innovation on how advanced engineering, computer science and modelling can be applied most effectively to tackle grand challenges of increasing the sustainability of energy supply and utilisation, while minimising negative environmental impacts, in particular, net carbon emissions.”


We hope you enjoyed this data science edition of Our Week in Digital. If you are looking for a new job in data science, get in touch with us today! We are always keen to hear from exceptional data specialists!

About the author: As a founder of Ignite Digital Talent, I lead our brilliant team to ensure we deliver time and time again for our clients. I also stay closely networked with industry influencers to ensure we are well placed to understand the issues and challenges our clients face.
