At the dawn of the fourth industrial revolution, fears are often expressed that computers will replace human beings. The reality is, in many fields, this is already happening — and rightly so.

Take, for instance, medical diagnostics.
A few years ago, IBM’s artificial intelligence (AI) platform, Watson Health, was tested at the Memorial Sloan Kettering Cancer Center in the United States, the largest facility of its kind in the world.
Watson was found to be capable of predicting lung cancer with greater accuracy than highly trained and experienced radiologists under the same circumstances.

The machine proved able to rapidly analyse massive amounts of data — in this case, two million pages of medical journals, 1.5-million patient records and 600 000 medical findings — and then apply its synthesis to the CT scans shown to it to reach its conclusions. This level of information absorption clearly surpasses human ability.

Watson also made treatment recommendations, but it was not infallible in its diagnoses. Nor could it take ethical responsibility for the difficult decisions that typically follow, such as whether to proceed with a biopsy — which is an invasive procedure and, therefore, not without risk — and then, whether to go ahead with surgery, radiation or chemotherapy, all of which could damage the patient’s health and quality of life more than improve them.

For these complex and nuanced questions, humans are indispensable. No algorithm could replace human judgment. There is no app for human values. However, it would be senseless not to use machines to perform tasks at which they are faster and better than people. For some things, machines simply are the smarter choice — even as a sophisticated tool in skilled human hands.

This is easier said than done, though. One cannot just feed any data into a computer and expect a stream of solutions to be spat out on the other side. There is a useful term for this in computer science — “GIGO”, which stands for “garbage in, garbage out”. In other words, if your input is flawed, your output will be too.

Validating data quality is therefore essential, and it can be a huge task in the age of “big data”, where sheer volume makes information difficult to handle. The generation of data has exploded in the past few years. Faster and cheaper computers and smartphones, as well as the internet — and lately, the internet of things — have made the gathering, sharing and exploitation of data pervasive in almost all sectors, from finance and commerce to health, the bio-sciences, engineering and many others.
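The principle is easy to demonstrate in a few lines of code. The sketch below is purely illustrative — the field names and plausibility thresholds are assumptions, not drawn from any real medical dataset — but it shows how a single flawed record can distort an analysis, and how a simple validation step catches it:

```python
def valid_record(record):
    """Accept a patient record only if its values are plausible."""
    age = record.get("age")
    size = record.get("tumour_mm")
    return (
        isinstance(age, (int, float)) and 0 <= age <= 120
        and isinstance(size, (int, float)) and 0 < size < 500
    )

def mean_tumour_size(records):
    """Average tumour size, in millimetres, across the given records."""
    sizes = [r["tumour_mm"] for r in records]
    return sum(sizes) / len(sizes)

records = [
    {"age": 54, "tumour_mm": 12.0},
    {"age": 61, "tumour_mm": 8.5},
    {"age": -1, "tumour_mm": 9999.0},  # garbage: an obvious data-entry error
]

# Garbage in, garbage out: the bad row skews the average wildly.
naive_average = mean_tumour_size(records)

# Validate first, then analyse: the result becomes sensible.
clean = [r for r in records if valid_record(r)]
validated_average = mean_tumour_size(clean)  # 10.25
```

Real validation pipelines are far more elaborate, of course, but the lesson scales: no analysis, however clever, can rescue input that was never checked.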

Workplaces are also becoming more integrated, with multidisciplinary teams working on shared problems. This is particularly true of activities involving data and computing — and nowadays this is nearly everything, because data is being collected and analysed in almost all fields.

It is our task as universities to prepare our students for the new world of work where the discovery of useful knowledge from data will become integral to most of what they do in future. Furthermore, big data and AI will fundamentally change not only the world of work, but also our world as knowledge workers at universities.

Now, it has been said that it is easier to change the course of history than it is to change a history course. But the world is changing fast, and universities have to adapt. For centuries, universities have been organised along distinct disciplinary lines. But data science and computational thinking exceed the boundaries of traditional academic fields of study and look set to revolutionise current academic and professional paradigms.

Academic Jeffrey Buller pointed out in 2015 that: “The choice in higher education today isn’t whether we should change but how … change is already here. The issue is what we’re going to do about it.”

At Stellenbosch University, we responded to this challenge by adopting new rules on academic entities within and alongside departments and faculties. This has paved the way for establishing our new School for Data Science and Computational Thinking, which will be launched at the Stellenbosch Institute for Advanced Study on July 29.

The Stellenbosch University School for Data Science and Computational Thinking is a game-changer in higher education, both in South Africa and beyond. It will work across all 10 of the university’s faculties and will span the entire academic project — from under- and postgraduate training to research and specialist consultation. It will also support the private and public sectors as a trusted and respected partner in and for Africa, catering not only for full-time students but also offering online modules to professionals looking for new methods and best practices to use in their work.

Writing about the challenges facing higher education, Clayton Christensen and Henry Eyring made the observation a few years ago that “The technologies that now threaten to disrupt traditional universities … can also reinvigorate them to the benefit of so many people.” I am confident that this is exactly what we have managed to do with the new Stellenbosch University School for Data Science and Computational Thinking.

Professor Wim de Villiers, a gastroenterologist by profession, is rector and vice-chancellor of Stellenbosch University, and currently also vice-chair of Universities South Africa