From digit to digital business transformation

The word digital comes from the same source as the word digit: digitus, the Latin word for finger or toe. Since fingers are often used for counting, you could loosely translate ‘digital’ as ‘finger-like’: something you can count on. The use of the word digital in this counting sense arose around 1938 and became a basic concept for the computers then under development. After all, quantities had to be registered electronically, and ‘digits’ had to be added up. Since we humans have ten fingers, the numerals 0 to 9 are our ‘digits’, each digit being a single numeral. In the decimal system derived from this, numbers above 9 consist of several digits: the number 23 consists of two digits, the number 256 of three.

Logical calculation

The most basic level of counting is the binary system: 0 or 1. Binary arithmetic dates back to ancient Egypt, but it was the 17th-century German philosopher and mathematician Gottfried Wilhelm Leibniz who formalized today’s binary number system. In 1672, Leibniz also began building a mechanical calculator, the stepped reckoner, that could multiply and divide. Although not completely error-free, its design remained the basis for mechanical calculators for over two hundred years.
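To make the two counting systems concrete, here is a minimal Python sketch (an illustration of mine, not part of the original argument) that rewrites the decimal numbers mentioned above using only the two binary digits 0 and 1:

    # Decimal numbers rewritten in binary, which knows only the digits 0 and 1.
    for n in (23, 256):
        print(n, '=', bin(n)[2:], 'in binary')  # bin() adds a '0b' prefix, so strip it
    # 23 = 10111 in binary
    # 256 = 100000000 in binary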

Leibniz was also the founder of so-called logical formalism: converting an argument or line of reasoning into a calculation. He developed an objective symbolic language in which to express arguments, because our natural language was too subjective for that. Two hundred years later, George Boole (1815 – 1864) developed the first complete logical formalism. Boole was a professor in the Irish city of Cork, and his so-called Boolean algebra is considered the foundation of modern digital computer logic. He based his logic on concepts from set theory – intersection, union and complement – for the operators AND, OR and NOT.
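That correspondence between set operations and logical operators is easy to demonstrate in a few lines of Python (a minimal sketch of mine; the sets are arbitrary examples):

    # Boolean logic on truth values...
    a, b = True, False
    print(a and b, a or b, not a)   # False True False

    # ...mirrors Boole's set operations: intersection, union and complement.
    universe = {1, 2, 3, 4}
    A, B = {1, 2}, {2, 3}
    print(A & B)         # intersection ~ AND -> {2}
    print(A | B)         # union        ~ OR  -> {1, 2, 3}
    print(universe - A)  # complement   ~ NOT -> {3, 4}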

The father of information technology

Boole’s work remained relatively unknown except among a small group of experts who studied logic. About seventy years after Boole’s death, Claude Shannon (1916 – 2001) attended a philosophy lecture where he became acquainted with Boole’s work. He understood that this was a good basis for describing mechanisms and processes in the real world, which was highly relevant in the era of electromechanical, relay-based calculators. At MIT, he wrote his master’s thesis in 1937 showing how Boolean algebra could be used to design better relay circuits. In retrospect, it is sometimes called the most important master’s thesis of the 20th century, because it formed the basis of our current electronic computers.

In 1948 he wrote ‘A Mathematical Theory of Communication’, in which he solved the problem of reconstructing a signal sent by a transmitter. This is also known as Shannon’s channel capacity theorem: over a given communication channel, a signal can only carry a certain amount of information. How much bandwidth does a channel need for you to reconstruct all the signal content from the noise at the receiving end? This publication is in fact the foundation of information theory, and Shannon is considered the father of information technology. In 1949, he wrote another important treatise, ‘Communication Theory of Secrecy Systems’, which laid the foundation for the mathematical theory of cryptography. It is in effect the mirror image of his earlier goal of ‘complete reconstruction’ of a transmitted signal: the transmission of a signal that is ‘completely unreadable’ to other parties.
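That ‘certain amount of information’ is usually expressed as the Shannon–Hartley theorem, C = B · log2(1 + S/N). The sketch below (my own illustration; the telephone-line figures are assumed, not taken from the article) computes this limit in Python:

    import math

    def channel_capacity(bandwidth_hz, snr):
        # Shannon-Hartley limit in bit/s: C = B * log2(1 + S/N),
        # where snr is the signal-to-noise ratio as a power ratio, not in decibels.
        return bandwidth_hz * math.log2(1 + snr)

    # Example: a classic ~3 kHz telephone channel with a 30 dB signal-to-noise ratio.
    snr = 10 ** (30 / 10)                      # 30 dB -> power ratio of 1000
    print(round(channel_capacity(3000, snr)))  # ~29902 bit/s

No matter how clever the encoding, no receiver can extract information from such a channel faster than this limit.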

Thinking from mathematics

As Shannon suggested, Boolean algebra provides the ability to describe mechanisms and processes in the real world. Doing this with electronic circuits that calculate in binary requires being able to ‘think from mathematics’. This is also why electrical engineering, mathematics and computer science are so strongly connected, as at my alma mater, the Faculty of Electrical Engineering, Mathematics & Computer Science at TU Delft. Pure digitization is therefore difficult for non-mathematical people. After all, digitization requires a mathematically logical way of thinking to describe the real world virtually, so that it can be programmed in software.

Digitization is the conversion of analog and manual processes into digital form. To digitize a paper archive, you need both a scanner and a program to make the information on paper digitally available. To use that software, hardware such as computers and networks is required. In this way you create information in digital form: data, the smallest building blocks of our digital world. The advantage is that computers can process data far more accurately and faster than humans. The typewriter was essentially a ‘word processor that printed instantly’; digitization puts software between input and output so that the entered text can also be displayed, saved and sent digitally.

Digital and digitization

Digitization is therefore the conversion of analog or manual processes so that we can perform them faster, more accurately and repeatably. Digitization in itself does not contribute much to making a process more efficient unless the processes are also further automated, so that the manual work is done by computers and machines. Automation and digitization go hand in hand, but together they also create island automation: a limited, manual, analog process is replaced by an automatic, digital solution. At the end of the last century, for example, the automation – and thus digitization – of many processes began: siloed environments with specific applications and associated specific data that digitally automate a single process or workflow from start to finish.

At the beginning of this century, the concept of ‘digital transformation’ emerged. You can see this as an organization’s transition to a business model that uses digital solutions to respond to user and market needs and expectations. To be able to do this, the underlying processes must of course be 100% digitized. Digitization of business processes is a component of, and a precondition for, a digital transformation, but never the goal. Digital transformation primarily means adjusting or renewing the business model, including the organization itself. It is therefore not an ICT project, but a business task.

Digital business transformation

Start-ups begin 100% digital and can skip digitization: they work digitally from day one. Existing organizations, however, have the handicap that either their analog, manual processes must still be digitized, or the islands from earlier digitization must now work together in a digitizing inner and outer world. The business model requires a user-centric approach, while digitization requires a data-centric approach. All digital processes can then be developed between the user and data layers, which, together or side by side, form the new business model. The challenge is therefore to design business models in which both users and data are central.

To give shape to this, enterprise and business architects are needed who develop, design and specify a feasible and maintainable design of the new business model based on purposes, functions, users and data. Because it is a business issue, it demands digital skills from the directors who (must) lead it. That is currently often the biggest challenge to successfully directing and implementing a digital business transformation...

By: Hans Timmerman, Chief Data Officer at DigiCorp Labs and Director of Fortierra
