Nature and the members of its ecosystems have created an amazing variety of communication systems, both on land and under water. It is easy to be impressed, since their speed, density and social commitment go beyond anything humans would consider. In this three-part article we first explore how nature communicates, second we study the parallel between data and petroleum, and finally we discuss what a new generation of the internet, inspired by nature, could look like.
An internet model inspired by Nature
1. An economic growth and job generation model inspired by Nature
The world is trying to shrug off the economic and mental damage from a virus. There is a need to revive the economy from the bottom up. We know that data is the new petroleum, and therefore, as part of a strategy to kickstart the economy, we could use "data power" to provide a welcome and needed source of income. However, the main challenge is that nearly all revenue generated by data is earned by only a dozen players worldwide. The present structure of the internet and the high concentration of earnings imply that the marked increase in internet use is "making the rich richer". Worse, we are losing an opportunity to democratize the internet, both its access and its income model. An estimated 50% of the world population is excluded from the full benefits offered by the internet, while the other 50% is exploited over the internet through a business practice known as data mining, which has no respect for privacy and rather resembles a modern-day form of slavery and dependency, forcing billions into polarized opinions.
In this article I propose that we first draw inspiration from how nature communicates, before we enter into the details of the human-designed and human-centered internet. Could we emulate this surprisingly efficient system of exchanging data and trading information? If so, then chats on social networks could resemble the trunks and branches of trees, supported by strong roots, strengthened by mycorrhizal fungi that connect the local networks into a coherent system. We could establish secure connections of one pod (area) with another, while each has its own rules for data gathering and sharing, based on universal ethics ingrained in human rights. Leaves and undergrowth would nourish the soil, providing the energy that makes microorganisms perform, combining the unique contributions of each of the five kingdoms of nature (bacteria, algae, fungi, plants and animals). Everyone is always learning and continuously inspired, finding pathways to improve performance and resilience, and finding joy in the process of discovery. Once we grasp how Nature has evolved a data collection and communication system, can we then imagine a political and technical framework to redesign the internet based on Nature? I believe we can.
All information of a forest circulates within the local ecosystem. Few of us realize that Nature, this amalgam of ecosystems, operates the greatest communication network ever. It is self-sufficient in energy and has been operating for millennia. While we may be impressed by the exploits of 5G or the 30,000 satellites that Mr. Elon Musk wishes to put in orbit, Nature's information and communication system has proven to be far more capable and resilient. Surprisingly, most of it operates right underneath our feet, or in the vast water bodies of our oceans. This network is hidden from our eyes while it operates as an interface between plants, fungi and soil rich in microorganisms, and, in the seas, as a soup of viruses and bacteria blended with plankton and microalgae. These systems work all the time, trading 24 hours a day with amazing stability and a great diversity of communication methods. Their ultimate purpose is to take care of everyone, even considering the specific needs of the young and the elderly, leaving space for new life, and even new forms of life, to emerge in ways we have never imagined or practiced in our human-centered world. To our surprise, this highly capable internet has operated in the seas since the beginning of time, and on land since fungi and plants first colonized it, roughly 450 million years ago.
The land-based underground network has received the most attention, even though it remains at the margins of science, and has been wittily called the "wood wide web". This network connects its members and communicates intensively about available resources and imminent threats, the balance of nutrients and the density of water. The physical connections provided by these fungal networks are thinner than a silken thread, yet their combined length can measure one kilometer in just one gram of soil. This is mind-blowing. The amount of connectivity that is right under our feet is orders of magnitude more intense than the one million Internet of Things (IoT) connections per square kilometer that data mining companies are projecting. The fungi, which operate the core of this communication system, exchange and process all information in return for sugars and fats. Ultimately, the responsibility of this "www" is to promote life. Can we ever emulate this level of sophistication?
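A back-of-envelope calculation makes the scale gap tangible. The hyphae-per-gram figure and the IoT projection come from the text above; the soil bulk density and sampling depth are my own assumed, typical values, so treat the result as an order-of-magnitude sketch (and note it compares cable length against connection count, which are not the same unit):

```python
# Back-of-envelope comparison: fungal hyphae vs projected IoT density.
HYPHAE_KM_PER_GRAM = 1.0        # ~1 km of hyphae per gram of soil (from the text)
SOIL_DENSITY_G_PER_CM3 = 1.3    # assumed typical topsoil bulk density
DEPTH_CM = 10                   # assumed: count only the top 10 cm of soil

# Mass of topsoil under one square kilometre (1 km^2 = 1e10 cm^2)
soil_mass_g = SOIL_DENSITY_G_PER_CM3 * 1e10 * DEPTH_CM
hyphae_km = soil_mass_g * HYPHAE_KM_PER_GRAM   # km of hyphae under 1 km^2

IOT_CONNECTIONS_PER_KM2 = 1_000_000            # projected IoT density (from the text)

print(f"Hyphae under 1 km^2: {hyphae_km:.2e} km")
print(f"Kilometres of fungal 'cable' per IoT connection: "
      f"{hyphae_km / IOT_CONNECTIONS_PER_KM2:,.0f}")
```

Even under these conservative assumptions, every projected IoT connection would be matched by over a hundred thousand kilometres of living fungal filament beneath it.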
The interspecies communication in the sea has reached a level of sophistication that has been left largely unstudied, and is thus barely understood at all. Since water is nearly 800 times denser than air, we can only imagine how the exchange of information has evolved in water bodies. While research is still in its infancy, there is an emerging consensus that information is exchanged and messages are swapped through many channels: biochemical changes in the water caused by metabolism or anxiety; electromagnetic shifts emanating from the movements of organs; the subtle perception of changes in sound, especially the heartbeat, which is audible over long distances thanks to highly sophisticated filtration techniques that permit zeroing in on that particular sign of life; calls with perfectly pitched sounds; and even careful drawings that resemble highly expressive art, along with play with colors. All these, and more yet to be discovered, are part of this diverse portfolio of communications.
1.1. From Anthropocene to Fungi-cene
Many modern-day citizens think we have created the Anthropocene, an era in which life is determined and dominated by humans. When our visionary computer scientists refer to the intelligence embedded in nature, they usually point to neural networks. True, it is easy to be impressed by the capacity of our brain. However, if you really want to be impressed, then spend a few hours exploring the magic of fungal networks, which include nothing less than the largest living organisms in the world (nearly ten square kilometers of mushroom filaments running through the soil, all sharing the same DNA). Then take the time to explore the precision of data transmission through pheromones, or study the varying content of whale songs and the diverse languages mastered by sperm whales. Their capacity to express precise instructions and opinions in the same structure as language, but with different syllables according to the oceans where they roam, has baffled scientists.
We can only be self-confident about "our" internet and "our" brain as long as we do not realize how viruses, bacteria, fungi and microalgae organize life, just like corals, sharks and pufferfish. While we tend to observe our role in the destruction of ecosystems, habitats, and animal and plant life, we seldom look at the burgeoning of Nature's internet in the soil, water and air. Just to illustrate: pheromones, these tiny molecules, pass very selective messages through the air with a precision of one part per billion, reaching a specific (future) partner or family member miles away. Delisea pulchra, a red seaweed living in the Tasman Sea, in what could be described as a soup of microorganisms with a million viruses and bacteria per milliliter of salt water, has the capacity to jam the communication system of bacteria, preventing the formation of biofilms and the spread of infections. Whales exploit the ocean's deep sound channels to communicate with their kin over thousands of kilometers. These vast networks of communication deployed in soil, air and water have one main purpose: to promote life and build up resilience. So, we will discover that the role of fungi in nature is much more than to degrade and decompose plant matter. Fungi could be viewed as the masters of this Nature's internet, connecting everyone on land and communicating everything, while securing fairness in the process. This social element, part and parcel of fungal culture, is new, and not expected from a world that has been described as "wild to be tamed".
1.2. The Plant and Fungi Data
The reset we need for an internet controlled by a few players could be inspired by the plant and fungal networks, so as to permit an intense exchange of information. In Nature, all data is locally managed, bringing benefits to everyone. Information for what? Let us summarize the main content of the exchanges as science has understood it to date: fungi sense and register the presence and the amount of carbon. Based on their proprietary data, since there is no one else with the means to collect this information with great precision, fungi determine the price of sugar. This is not a cold calculation and price setting based on supply and demand. The fungi assess the value of an old tree and its role in the ecosystem, while ensuring the connection between the hundred-year-old stump and its offspring in the area. Based on this detailed set of observations, driven by a search for quality of life rather than maximum revenue, the fungi and the plants engage in trading whereby fungi deliver mainly carbon. The fungi ultimately decide which plants pay, and how much, at a price expressed in an amount of sugar and fats.
We have discovered through recent experimental field research that the price of carbon is not uniform! Take the example of an area where human activity has mined most of the carbon out of the soil: plants have to pay a higher price for acquiring carbon and some other trace minerals that are desperately needed. This higher cost limits their growth. However, the extra income for the fungi is partly redistributed to the needy, and partly used to rebuild the fungal network, which was regularly destroyed by tilling. The fungus has a culture of taxing and redistributing wealth. Could we imagine a human-designed internet where we are thoroughly connected throughout a local area, able to exchange data, and to provide livelihood and robustness in a system that regenerates the soil as suggested above?
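The pricing-and-redistribution behaviour just described can be captured in a toy model. This is purely illustrative: the function names, the scarcity factor and the 50/50 split between needy plants and network repair are my own assumptions, not figures from any ecological study.

```python
# Toy model (illustrative only) of fungal "taxing and redistribution":
# carbon costs more where it is scarce, and part of the surplus is
# redistributed to needier plants and to rebuilding the network.

def carbon_price(base_price: float, local_scarcity: float) -> float:
    """Price rises with scarcity (0 = abundant soil, 1 = heavily depleted)."""
    return base_price * (1.0 + local_scarcity)

def redistribute(surplus: float, needy_plants: int, network_share: float = 0.5):
    """Split the surplus between needy plants and network repair (assumed 50/50)."""
    to_network = surplus * network_share
    per_plant = (surplus - to_network) / max(needy_plants, 1)
    return per_plant, to_network

# A depleted, tilled field pays more than an undisturbed forest plot:
forest_price = carbon_price(base_price=1.0, local_scarcity=0.1)
field_price = carbon_price(base_price=1.0, local_scarcity=0.8)

surplus = field_price - forest_price  # extra income earned in the needy area
per_plant, to_network = redistribute(surplus, needy_plants=7)
print(f"forest pays {forest_price}, field pays {field_price}")
print(f"each needy plant receives {per_plant}, network repair gets {to_network}")
```

The point of the sketch is only to show that "price" and "welfare" can live in the same mechanism, which is exactly what the field research attributes to the fungi.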
Nature creates and operates information networks that guide a market with fluctuating prices for carbon. In addition, fungi will engage in partnerships that take care of basic needs beyond food for all members of the ecosystem. The fungal-flora network, supported by microorganisms including bacteria and viruses, considers the young and the old with their specific needs, and secures information and nutrition that is supportive of all life. Over millions of years the fungi have learned that the young, the old and the weak share wisdom and knowledge that go beyond the mere utilitarian goals of trade. This approach permits everyone present in the local system to share, learn and benefit. This strengthens the whole.
With sugar as the currency and carbon as the main commodity, this trading system builds interrelations that become the building blocks of biodiversity. The services offered by this communications network extend to health and safety measures, even to emergency calls. Take the example of an approaching herd of elephants. These giant mammals rip off branches and leaves to still their grand appetite. This is painful for trees. Once the fungal network senses the arrival of the herd, thanks to its pounding on the ground reverberating through the region, it will quickly announce this imminent danger through its underground communication system, offering a sense of direction and timing. It is like a remotely controlled burglar alarm; it even extends to health and safety insurance, including preventive measures to reduce possible damage. Warned of the arrival of elephants with a ferocious appetite, plants will quickly load their leaves with bitter compounds, aided by the fungal network. This rapid exchange, whereby liters of fresh chemicals are pumped through the tree's twigs and leaves, is only possible thanks to the data collection, communication and trading power of the fungi. The elephants taste the bitter leaves and quickly move on to avoid an upset stomach.
We are just at the very beginning of our discovery of how nature and its communication systems work, and we realize that we have barely scratched the surface of underwater communications. It is now clear that the soil, with its fungi and flora, is wired with ramifications beyond our comprehension. This newly discovered resilience in ecosystems permits local communities to thrive, strengthening the basis of life as we know it. This incredible reality beneath our feet and in the water offers us a chance to become humble, instead of clinging to the often arrogant belief that our computing and communication systems are modern and unique. The reality is that, compared to what is happening in the soil and sea, even our satellite connections and our internet data exchange are a poor approximation of what Nature does.
The exchange of information at the core of these networks is local: all data is locally processed, stored, shared, learned from and acted upon. This inspires me to suggest how to evolve from data mining to data farming and data agroforestry. But before we enter into proposals, let us first study the economic power behind them by drawing a parallel between data and petroleum.
2. Data: The New Petroleum of the Economy
When in late December 2020 the communication medium WhatsApp announced its new terms of service, imposing a novel policy of sharing private data with Facebook, its controlling shareholder, tens of millions of users shifted in a matter of days to the alternative services of Telegram and Signal. The rush to alternative messaging systems spiked overnight, since WhatsApp imposed a very tight deadline for accepting the new terms. While WhatsApp does not share content with anyone, which it claims is fully encrypted to the point that even the service provider cannot access it, the app announced that it will share all metadata: who sends a message to whom, when, and from where. WhatsApp openly says that its business model uses data related to its clients for profit. This is in contrast with the statement made in 2014 by Facebook when it took over WhatsApp: Facebook would not collect data from WhatsApp. However, as we now know, WhatsApp offers data to its mother company! There is always a fine line separating reality from intention, and that is because "data is the new petroleum".
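To make concrete what "sharing all metadata" means even when content stays end-to-end encrypted, here is a hypothetical record. The field names and values are illustrative inventions, not taken from any real WhatsApp schema:

```python
# Hypothetical sketch of the kind of metadata a messaging service can log
# even when the message content itself is end-to-end encrypted.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MessageMetadata:
    sender_id: str         # who sent the message
    recipient_id: str      # to whom
    timestamp: datetime    # when
    approx_location: str   # from where (e.g. coarse IP or cell geolocation)
    ciphertext_bytes: int  # only the size; the content stays unreadable

record = MessageMetadata(
    sender_id="user:4711",           # illustrative identifier
    recipient_id="user:0815",        # illustrative identifier
    timestamp=datetime(2021, 1, 15, 9, 30),
    approx_location="Berlin, DE",
    ciphertext_bytes=2048,
)
print(record)
```

The provider never sees the plaintext, yet who talks to whom, when, where and how often is enough to build a detailed social and behavioural profile, which is exactly why metadata is valuable.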
Digital services, ranging from network infrastructure providers (like ISPs) to hosting services (like cloud storage providers) and online platforms (like social media and marketplaces), all consider that their future lies with data mining. Data mining is the business of getting privileged, even exclusive, access to a rich and unlimited flow of revenue derived from exploiting data. The European Union, with 450 million inhabitants, has tried to impose strict rules on data collection and the processing of personal information, including profiling. Unfortunately, its good intentions have not resulted in successful enforcement of its policies. The United States of America (330 million inhabitants) has been rather relaxed about data mining and privacy rules. The American Administration has tilted its policies towards a break-up of the technology empires that control this lucrative and powerful business. Whatever both regulators intend to undertake, only a small and very exclusive number of data miners today exploit the masses of information for commercial, political and strategic purposes. Here too we notice a great difference between reality and intention … because data is the new petroleum.
The novel business of data mining is not much different from petroleum. Both data and crude oil require central control to design, process and distribute products to clients. The petroleum industry, like the present format of data mining, is the antithesis of the natural model of information, communication and community service we described in the first part. However, the comparison between the two helps define clearly what is happening, what is needed and what has to be avoided. There are always different angles from which to study the same industry. For example: fossil fuels contribute largely to climate change, while secure and abundant access to petroleum became the grease of the economy, without which everything would come to a grinding halt, or could never have taken off.
Petroleum, as a strategic commodity that powered growth, enjoyed a century-long expansion. Its influence goes beyond energy. Derivatives of crude oil include asphalt for roads, plastics and packaging, fertilizers and even microplastics for cosmetics. We could state that the petroleum industry is one of the most efficient users of raw materials: nearly nothing is lost. Data miners are looking for the same centralized approach to their business: all possible information is gathered, stored and processed, ready to be sold.
From its advent, the petroleum industry adopted a centralized command-and-control model. This is also true for data mining, where a few powerful businesses dominate. This bias towards power in the hands of a few, with disregard for the local economy and citizen rights, especially privacy, pushes policy makers to rewrite the rules of the game, without much success. Worse, as each political authority creates its own set of rules, these are easily manipulated by the powerful few that master the market dynamics. Numerous attacks on the centralization of power in both the data mining and the petroleum powerhouses have been to no avail, so both continue to thrive and expand their clout.
Their business model emulates a simple logic: "more of the same", as prescribed by the economic model of globalization, standardization and economies of scale, where cost cutting stands central. Shell demonstrated this with amazing confidence by inaugurating near Pittsburgh, Pennsylvania (USA) one of the largest polymer factories in history, turning out one million tons of plastics a year. The pellets are distributed by more than 3,000 rail carriages to be turned into phone cases, auto parts and food packaging. Everyone knows that microplastics emanating from these plastic products will linger in the air, land and seas long after they have served their purpose. However, the projected growth in sales over the next few decades is so solid that it translates into more than a dozen additional facilities to be built by Exxon Mobil, Dow and the handful of players that dominate the market. What was and is true for petroleum seems even more true for data, which thrives on the massive expansion of communication networks and server parks. Will it still be true in the future? That is up to us to decide.
2.1. Consider a Better Roadmap
The purpose of this article is not to criticise petroleum or data mining companies, and certainly not to demonize sectors that have brought a lot of development to societies and quality of life to individuals. Rather than arguing for or against any business, we suggest in the next pages how we can design a different business model for data, one that serves the citizens of the world. Policy makers must know that there are choices to be made, and that if they do not stand by their promises, a new generation could take initiatives that transform the reality of the world. Instead of the present roadmap for data, which features an even greater concentration of power than we witnessed for petroleum, we must undertake concerted efforts to create opportunities that will change not just the data mining industry, but the livelihoods of the people it affects.
We must reflect on the importance of the consumption model that has been adopted for both petroleum and data. The energy model was carefully designed so that everyone would need a generous supply of fossil fuel for the rest of their life. The analogy is striking: no one can imagine that anything would work without data, without communication networks and server parks.
2.2. The Merger of Computer and Communications (C&C)
Data is created and collected at the nexus of computer and communication technologies (C&C). The owners of the early technology companies realized that the real money is not earned by moving boxes containing phones and computers, with their ever-changing technologies and ever-increasing investment costs. Real money is earned by handling the data itself. Data mining businesses designed a legal framework that ensures that all data, from the smallest source like a sensor or a mobile phone, is forcibly channeled to server parks, which emerged like mushrooms around the globe, mirroring all data many times over as it is accumulated and stored. While governments were asked to foot the bill for fiber optic networks and satellite communications, few realized that this investment would only serve the few who knew that data mining would dominate a digital world.
Steve Jobs and Steve Wozniak both realized that a desktop computer in every home, connected to a phone line, would gather much more precise and detailed data than the supercomputers in the basements of the headquarters of Wang or DEC, companies that no longer exist. Hence, the quest was on to identify ways and means of creating and capturing data, securing exclusive access, and building data highways, including fiber optic cables crossing oceans and continents. The pioneers designed a portfolio of devices equipped with software that permits the effortless capture and exclusive control of all metadata transiting computers, portable devices and phones.
In the same year that Apple was created (1976), Dr. Koji Kobayashi, the CEO and later Chairman of NEC, the Japanese computer giant, coined the term C&C (computers and communications). He recognized the tremendous value in the merger of computer and communication technologies and knew all too well that this would lead to an explosive demand for hardware, integrating two previously separate pillars of industry into one powerful new business conglomerate. However, he and many others only saw the hardware and never grasped the long strategic vision that underpinned the strategy of both Apple and Microsoft. Once computers and communications were connected, and a wide range of data transmission channels through Ethernet, ADSL, T-1 lines and dial-up systems were established, moving data through information pipelines, it was Steve Jobs who took the whole logic to a different level with the ingenious creation of the iPhone.
2.3. Services and Commissions through Miniaturization
The strategy that unfolded bears the stroke of genius and has impacted our connected world for decades. It has transformed our lifestyle. Many tried to imitate the business model of the iPhone, and while a few like Samsung succeeded in surpassing the number of phones sold, no one has succeeded in becoming as profitable, valuable and powerful. The revolution of the iPhone built on the creation of highly intelligent and mobile services (HIMS), a term first coined in 1980 by the Nomura Research Institute, the research arm of the largest investment broker of Japan. The success of the "mobile" was more than the capacity to transport phones and computers. It was miniaturization, as a fundamental third technological breakthrough, that made the HIMS evolve into a phone with very powerful computing capacity. The number of devices, technologies, software packages and services crammed into a phone could only evolve as fast as it did thanks to the ever-diminishing size of cameras, lights and sensors, supported by ever better performing chips and the standardization of the SIM card.
The master stroke of Apple was not just the endless services included in the iPhone; the real breakthrough is that this phone emerged as a must-have item, as desired as a Louis Vuitton bag. This smartphone was built on a business model that successfully negotiated a commission (and loyalty) on data volume: the mobile operators agreed to pay Apple a percentage of their earnings generated by the ever-increasing volume of data transiting the iPhone. At first most mobile operators resisted this commission. The first mobile phone company that cracked was AT&T. As a latecomer to mobile telephony, AT&T bet on the success of the iPhone and found the high commission, estimated at more than ten percent, a fair price to pay. It paid off for both AT&T and Apple.
2.4. Expand the Portfolio of Services and the Integration of Devices
The drive towards miniaturization, with ever smaller functional devices integrated into a handheld phone, and a three-way loyalty (Apple, network provider and user), was further reinforced through a novel Apple strategy of inviting creative developers to imagine all possible types of apps that would operate exclusively on the iPhone, thanks to a dramatic increase in functionality. This was not just a service like forecasting the weather in each city; it was the integration of dozens of pieces of equipment that used to be stand-alone into an integrated platform that was soon called a smartphone. From an alarm clock and a stopwatch to a high-performing camera, even an electrocardiogram and a program to predict the precise time of ovulation, to moon cycles with a full view of astronomy and astrology, a flashlight and a GPS: all these individual devices were bundled into one and only one iPhone. This meant that Apple locked users into consuming ever more data time and ensured the transit of an ever-increasing amount of detail about their daily whereabouts, professional needs and personal interests beyond work. Thus emerged a whole new culture around Apple. This approach locked users in to Apple, and locked in the mobile telephony companies, who witnessed exponential growth in the volume of data transiting their networks. Even voice left the telephone lines and was converted into digital soundbites with the arrival of Skype. As the customer base grew, and the data flows turned into tsunamis, fortunes were made overnight. It was no surprise that mobile telephony operators offered phones nearly for free, provided the client committed to operating exclusively with one operator. This business model resulted in customers exploiting the low rates within the network to acquire several phones.
Instead of one phone per house, within a decade each family member would have, from a very young age, their own smartphone, connected to dozens of IoT devices, and take this device everywhere. Imagine how unknowingly we offered anyone who provided us the network, the handheld device or the applications the opportunity to invade our privacy.
The solid and ever-rising revenues linked to an ever-growing suite of digital services ensured a continuous and ever more detailed flow of data: this is nothing less than a well-planned bonanza and the materialization of a vision. The continuous improvement of iPhones, with the integration of new connectivity technologies switching quickly from one to the other, continuously upgraded the streaming of data, soon including high-resolution pictures and even videos. Then came another master stroke: iTunes, followed by Apple TV and a myriad of online games, now accounting for more than 30% of the business. We often forget the sheer numbers: when the iPhone offered a phone version of the game Tetris, fans downloaded it over 100 million times. Before anyone realized it, and complemented by the novel developments with Pixar, Apple would bypass Disney, Universal and Vivendi in providing access to music, film and entertainment of all sorts, backed up by an estimated 2.2 million apps. Just think about the massive pool of data this represents for Apple and its privileged partners. The C&C of computers and communications evolved into a new C&C: (exclusive) Content and (total) Control.
Then competition geared up, and everyone learned from Apple's success. Google Play bypassed Apple with 2.8 million apps, but has not succeeded in copying the revenue model. Google built on its search engine, its original grip on the market with tremendous data collection, and then added Google Maps and Google Translate. This was powerful, and soon the two giants lived next to each other. Huawei, together with the smaller but design-savvy Oppo, both from China, carefully emulated these strategies, adapting them to a Chinese context. Samsung outshone its local competitor LG, focusing first on unit volume and then aiming for the upper market. The detail of the data that can be gathered through these companies operating globally represents trillions of data transmissions per day.
Since Apple's iOS and Google's Android put the smartphone at the center of the IoT (the Internet of Things), and offer the Cloud to store data, along with subscription services where a few can freely mine everything, one can only imagine the unparalleled opportunities associated with mega data. If one adds the likes of Facebook, Amazon, Tencent and Alibaba, then we have listed the magnificent ten. Data mining is not just the petroleum of the future; it is the guaranteed cash flow for these (and a few other) technology giants. Every time we click to accept cookies, we hand over everything about ourselves, without ever considering the implications of trading free email, free GPS, free translations, free weather forecasts and free social media for the total loss of privacy and the hand-over of a trillion-dollar revenue model.
2.5. It is only a beginning
For this strategy to maintain its momentum, there is a need to continuously improve the connectivity of mobile devices and to take over ever more storage at central points. The Cloud is composed of millions of servers that operate like grand data homes in heaven. At the same time that many talk about the marvel of having everyone's information in the Cloud, this set-up is emerging as the biggest energy guzzler in modern history, and it keeps growing. It is against this background, the Cloud on one hand and the energy-devouring system deployed to feed it on the other, that we have to understand the massive push to get 5G in place: it permits the data mining industry to fill its reserves in the Cloud at rapid and fully controlled growth rates. 5G is not only the next level of data mining; it is an improved version of communications that avoids interference, thus easing the flow of data from the most intense data users, ready to pay a premium price to very selective hosts.
While in the early days of the mobile phone, dropped calls and patchy connectivity caused a lot of customer ire, now the issues are radio-wave interference and the permanent risk of hacking. The new communication technology made a major effort to strengthen unencumbered point-to-point connectivity. This is necessary when the number of radio-based connections to phones in one square kilometer soon reaches the milestone of one million. Even though this is only a minute fraction of the density of the fungal network we described, this airborne communication system is entering its next technical and performance challenge. The more connections per square kilometer, the higher the risk of one connection disturbing another. Worse, the more unprotected connections with very weak security protocols, the more hacking will take place. In addition to the data mining by the top ten, there is data theft, extortion and even data kidnapping, planned and executed by millions of small gangs (and sometimes even political pundits). Users often forget that wireless communications were selected for ease of use, not for secure connectivity.
This increased complexity of mobile communications has many side-effects. First of all, instead of roaming around, the new 5G antennas can only perform when they are in the line of sight of the device to be connected. This novel technical requirement reinforces the digital divide between the heavily populated megalopolis and the thinly inhabited rural zones. In addition, the ever increasing complexity of a high concentration of users sending ever more data over no more than 1,000 different radio frequencies explains why the industry is cautious about expanding too quickly, or pushing transmission speeds too far.
This may sound like a contradictory statement, knowing the market’s expectation of seamless, faster and ever more accessible communications. However, there is a hard logic to it. If speed of data, measured in bits per second (bps), were a real top priority, then the industry would embrace better performing technologies like LiFi, the internet carried over millions of light frequencies. The deployment of a large number of 5G networks, supported by millions of newly placed antennas, can barely offer one Gigabit per second. This is a fraction of the Terabits per second LiFi can operate at. Why are the giants of the digital world and datamining reluctant to shift to a better performing data transmission system that is free and abundant?
2.6. It is Control over Data, Stupid!
Technically speaking, the amount of data throughput could increase a hundredfold, even a thousandfold, by shifting the transmission technology from the present radio waves, based on Bluetooth, satellites, 4G and 5G, to data transmission and geolocalization over light. Why does the industry not embrace the grand advantage that the light infrastructure is the largest and cheapest in the world? If data miners were to opt for this cheaper, faster and health-conscious connectivity, then there would be no need for billions in capital investment to install antennas, which face increasing resistance from citizens worried about electromagnetic frequencies.
The single reason why the datamining industries decided to stick to radio waves is that these can be tightly controlled by the few who have carved out their market in datamining. While everyone recognizes that radio waves are not at all secure and are easily hacked, the datamining industry concluded that control of its sources of information is more important than providing secure high speed connectivity. Light-based transmission at super speed is uncontrollable; worse, for the big operators, it invites too many small competitors to nibble market share away, and even potentially undermine the hegemony of data control!
The interest of the consumer (speed, volume, protection from hackers) and the democratization of the internet (making access available to everyone) are not the main preoccupations of the industry. While the industry and its political supporters claim that these are important, we have concluded that after greenwashing there is something new called “datawashing”! This is why this article suggests that there is an urgent need for academia, policy makers and citizen groups to urge a shift in focus from a narrow debate on 5G to a broad debate on datamining. Unless we offer a solid alternative to datamining and its C&C infrastructure (exclusive content and total control), we will remain locked in a system that serves few, and considers each one of us a data object, as the term “The Internet of Things” clearly suggests. Would it not be more appropriate to design the internet of and for people?
Why does the datamining industry refuse to embrace a better and faster technology? The risk of losing control is well founded. Why? Light has a billion frequencies, while radio waves only have a thousand. This implies that if the standard for data capture and transmission were to use a billion light frequencies in parallel, there would be no chance of keeping datamining exclusive. A government can sell a few licenses for the available radio frequencies at a high price to stem budget deficits, while offering exclusive access that ensures control of datamining for a few companies. However, the unregulated and free light frequencies are accessible to everyone! There is no way to charge anyone for using them. Remember, light frequencies are unregulated, while radio waves are highly regulated. Now you realize why the drive to expand the 5G network is so forceful: it is a race to widen the exclusive area for exploiting the petroleum and the lubricants of the economy of today and tomorrow: your data. Any effort to put a dent into this unrelenting drive will be bulldozed over.
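The order-of-magnitude argument above can be checked with simple arithmetic. A minimal sketch, taking the article’s own figures (roughly a thousand licensable radio frequencies versus on the order of a billion light frequencies) as illustrative assumptions rather than measured values:

```python
# Back-of-the-envelope spectrum comparison. The channel counts restate
# the article's own illustrative figures; they are order-of-magnitude
# assumptions, not measured values.
radio_channels = 1_000          # licensable radio frequencies
light_channels = 1_000_000_000  # usable light frequencies

# If channels ran in parallel at comparable per-channel rates, the
# theoretical head-room would scale with the channel count.
advantage = light_channels // radio_channels
print(f"Light offers roughly {advantage:,}x more parallel channels")
# prints "Light offers roughly 1,000,000x more parallel channels"
```

The point of the sketch is not the exact numbers but the scale of the gap: a resource a million times wider cannot be fenced off with a handful of licenses.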
Now that this control issue is clarified, we should take for a moment the position of the data provider. There are few credible initiatives to date that aim to protect privacy, distribute power and share the benefits of datamining with a wider audience. One of the exceptions is Sir Tim Berners-Lee, who is credited with the invention of the World Wide Web. In 2018 he announced his commitment to build a fairer and more decentralized internet. To that end, he created a data platform called Solid. The goal is to give internet users more control over how their personal data moves and gets used across the internet.
Another initiative is called Holochain, founded by Arthur Brock and Eric Harris-Braun, with objectives similar to Sir Tim’s. Their Holo project was created to bring the benefits of distributed apps to anyone with a web browser, redistributing the wealth and health made possible by our mounting interconnectivity. Both initiatives act in favor of a new form of internet, concluding that concerted, open source action on their new generation of cloud is more effective, since individual users, watchdogs of national governments and even the European Union ultimately have very little power to change the total control exercised by a few. There is a need for an entrepreneurial approach to datamining that complements the highly necessary regulatory initiatives.
Now, the Solid and Holochain platforms are not only about privacy; they are about the capacity to decide where your information goes and what you permit to be done with it. While this is a very valid contribution to opening up the market of data collection, storage and mining, we must point out that Solid and Holochain’s clients are large corporations and national services. While we welcome alternative options like these, we realize that they only soften the hard edges of datamining. We need a more fundamental shift in the business model underpinning the present drive towards datamining. That is the purpose of the last part of this article.
3. Shifting from Data Mining to Data Farming and the Internet of Life
The introduction of an internet based on light represents an exceptional opportunity to launch a completely new model: “Sustainable Data Farming”. This concept offers a chance to reverse “big datamining”, which exploits every single piece of data with total disregard for the dignity or privacy of the data provider, who unknowingly signs off on a total hand-over of every detail of his or her life for analysis and resale, in return for a free search engine, email address or social media account. If there is political goodwill that puts the individual first, with the entrepreneurial spirit as a guide, then the existing light infrastructure can evolve into the core platform for data transmission. If this logic is pursued, we can cut energy consumption at least tenfold, democratize the internet, and turn the tables on the existing powers. Just like the statement made about petroleum: we are not against, rather we are in favor of better. At the center stands the concept of “data farming”.
Just as the land is fertile because microorganisms, mushrooms, mosses, insects and plants interact with water, air and soil, data farming offers a rich understanding of the interactions within a family and a community. This content, accompanied by sense and sensitivity, will never be matched by the IoT. This data is first and foremost local and can be collected into “pods”, as Berners-Lee calls them. This permits the Internet of Things to evolve into the internet of and for people. Ultimately we hope that this will emerge as the Internet of Life (including people, but also considering every other living species that is part of the ecosystem).
At the core of data farming is the understanding that “Life is learning”. And learning is finding connections that you did not see before. The key building blocks of the new internet as operated by the ten largest enterprises are (1) data collection and (2) the Cloud. There is a third building block of the internet of tomorrow: (3) learning. While the first two are centralized and rather easy for the dominant internet operators to control, the third, learning, is where the dataminers are clearly and badly underperforming. This leaves a unique window of opportunity for local initiatives to stand up and perform in data services.
Learning about life cannot be done in the Cloud, unless you totally neglect culture, tradition, geography, and the ecosystems on which life depends in each part of the world. Learning can only succeed through the interaction of people who live and share within a local and cultural context. So it is in time- and place-based learning that data farming, even at a small and local scale, can outperform the grand dataminers. How? Because real learning is by definition local and interactive! Learning cannot be centralized in supercomputers with loads of historic and user data exploited by smart algorithms, or even Artificial Intelligence. Learning builds on local culture and intelligence first: the capacity to grasp where and how people live. That is why data farming offers, first of all, a chance to build community and celebrate culture and tradition, instead of globalizing the world with the same games and uniform multiple choice questions that force children to regurgitate the known answers instead of discovering the questions to which there are no answers.
It is surprising that the giants of informatics have invested massively in infrastructure to gather data, and in the Cloud to store and process data against the same uniform criteria. These conditions give dataminers the capacity to manage proprietary data. However, they do not invest at all, nor have the capacity to date, in learning systems. Their highly centralized approach greatly handicaps their ability to learn from systems that adjust to local conditions and create real empowerment to undertake a betterment of life. This is the competitive advantage that represents an unparalleled opportunity for data farming to emerge and operate successfully in the immediate vicinity of communities. Learning makes sense out of complexity and diversity for the common good, which is by definition different for each community.
This core concept has been worked on by pioneers of chat services like Guillaume Asselot, the founder of Tree Chat, a gamified chat for collaborative data gathering that starts within a small community and leads to joint actions through seeing the connections. The chat grows just like a tree, where ideas emerge and merge until there is sufficient fertile ground to blossom and pass on to action. Just as the tree in the central market sets the stage for a community around which local life evolves, simply by providing shade and majestic beauty, this Tree Chat permits a natural and spontaneous evolution of information, gathering and increasing in relevance. As people discover common interests and a common vision in their communities, or simple and basic needs that were not recognized before, local data farming takes root. It starts local and small scale, by learning about each other.
3.1. From Global and Exclusive to Local and Human Scale
Data farming is local. While a data farm may seem insignificant, we may need to be reminded that the macro-economy is the amalgam of the micro-economy. Just as one tree is only a tree and a flower is only a flower, yet a few trees, bushes, mosses and grasses evolve into a resilient and efficient ecosystem, its biodiversity continuously enriched thanks to feedback loops and multiplier effects, so farmers created cooperatives and communities to jointly purchase materials, process the harvest and sell together, with the aim of earning better revenues and strengthening their bargaining power. We realize that the same can be undertaken with data.
The parallel logic is powerful: every house, even in a shanty town, has light. Each light should be the most efficient LED, saving energy like never before, equipped with a chip that can process and transmit data. This technology is available today. Technologies, however, are not the game changers. The business models that deploy clusters of technologies are the real frameworks that permit reality to be transformed.
Each device, from a phone to a tablet, a video game console or a home television set, can connect over the light network to a central server at home, or at a friendly neighbor’s. Today’s laptop computers have enough power to assume that role. This can quickly grow into a collective patchwork of small servers located in basements or attics, without using any of the existing radio waves. Instead we can use the existing copper cables which run through every house and every room. Later, when the time is ripe, these can be complemented with optical fibers and expand into a patchwork of connected lights, laptops and LANs (local area networks).
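The household server described above is nothing exotic. A minimal sketch of such a home “data pod”, assuming it runs on a laptop and devices reach it over the home network; the endpoint and the timetable data are hypothetical illustrations:

```python
# Minimal sketch of a household "data pod" server. The /timetable
# endpoint and the bus data are hypothetical examples, not a real API.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Local data stays local: a small in-memory store on the home server.
LOCAL_DATA = {"/timetable": {"bus_7": ["08:15", "09:15", "10:15"]}}

class PodHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        record = LOCAL_DATA.get(self.path)
        body = json.dumps(record if record is not None else {"error": "not found"}).encode()
        self.send_response(200 if record is not None else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# Port 0 lets the OS pick a free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), PodHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A device on the LAN asks the pod, not an overseas cloud.
url = f"http://127.0.0.1:{server.server_port}/timetable"
with urllib.request.urlopen(url) as resp:
    answer = json.loads(resp.read())
server.shutdown()
print(answer["bus_7"][0])  # → 08:15
```

Any laptop can run this today; the shift the article proposes is not technical capacity but where the data is allowed to live.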
Let us be clear: we are not against radio waves! We are in favor of high speed and volume, the democratization and inclusiveness of the internet, the sharing of the revenues from data farming, and especially controlling our own data and participating in the associated revenues. We wish to farm information into a useful set of interrelated facts, permitting a better understanding of how to live sustainably, creatively, in better health, even more happily, and of how to be of service, building resilience and strengthening the common good. This is not a romantic view, but rather a vision that can be rendered reality immediately thanks to its simplicity of execution: one data farm at a time.
3.2. From Concept to Reality
A few families can work together and join a community data farm, with the support of a small village in the countryside, a city ward, or a block that is part of a megalopolis. In the traditional framework of the internet, there is a protocol known as the API (Application Programming Interface), which permits access to and exchange of information. This is made available to small players on the internet who are keen on integrating themselves into the grand information networks like Google and Amazon. For example: the Metro of Paris, one of the giants of data, has an API agreement with Google. Since travelers will ask Google (and not the Metro) how to get from the airport to downtown, the Paris Metro offers all its data for free to Google! Google then reserves the right to datamine the vast amount of Q&A generated thanks to the free provision of information by the metro company. The Metro, under the present legal framework, has no leverage to negotiate. Therefore, even with Paris hosting the Olympic Games in 2024, the Metro grants Google and others not just free access to its updated information; it also allows them to exploit all the data that can be mined and to earn royally from online advertising.
Data farming, as proposed, will create a shift in the concentration of power. The small and yet so valuable information on every detail of life that used to be captured through millions of applications could be shielded from large data companies through carefully designed data farms. At first, this will happen unnoticed. To the operators of datamining, the amount of data lost is totally insignificant. However, this small intervention at a local scale could quickly grow into tiny, fertile “data” veggie gardens, where the “tomatoes” for local consumption are the local data, built up over time and connected with other related facts of high relevance for the local community. Everyone has the chance to discover new connections and an opportunity to learn, steering citizens to discover their potential, preferences and opportunities. Veggie gardens can grow under the canopy of trees, protected from the harsh sun.
Every simple detail that anyone in town may wish to consult: the services of the city, the music the local community loves to hear, played by local musicians discovered for the first time, the arts programs offered by local amateurs (who are often very professional), the transportation timetables, the hospital schedule, the local sports agenda. All this information is farmed. It grows and is harvested with a most interesting side-effect: a dramatic reduction in energy consumption and an end to this unparalleled control of data. How is this possible?
3.3. The End of Data Globetrotting
If you land in Tokyo and do not understand the language, you query Google in your preferred language on how to get to your hotel. Even though Japan has an amazing fiber optic network and very well functioning mobile telephony, with some of the most professional companies like NTT, NTT Data and Softbank, a question entered on a phone at Narita or Haneda Airport travels through these companies around the globe, and is ultimately stored on at least three servers in unknown overseas locations. This needless shipment of data helps fill up the trans-Pacific optical fiber cables, while its mirrored storage consumes so much energy that it emits an excessive amount of carbon. Thousands of simultaneous requests pop up all around the airport, and one million questions are captured in 40 million data units: who, from where, where to, time, arriving from … This flow of information, covering every imaginable subject, fills the data pipelines and builds up the data tanks. If one were to use the data farming concept, then the information about Tokyo Airport needed by someone at Tokyo Airport would be answered immediately from Tokyo Airport, and kept on local servers. Makes sense?
Actually, it makes a great deal of sense to mitigate the clogging of the information pipelines with so much data moving across oceans for only one purpose: to feed a dozen data miners. This traffic subsequently creates pressure to build more transoceanic cables and facilitates more infringement on privacy.
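The local-first resolution described in the Tokyo example can be sketched in a few lines. All names here (the farms, the query, the answer) are hypothetical illustrations of the routing principle, not a real system:

```python
# Sketch of local-first query resolution: a question raised on site is
# answered from the site's own data farm when possible, and only
# otherwise forwarded elsewhere. All names are hypothetical.
from typing import Optional, Tuple

class DataFarm:
    """A store that answers queries from its own local records."""
    def __init__(self, name: str, records: dict):
        self.name = name
        self.records = records

    def answer(self, query: str) -> Optional[str]:
        return self.records.get(query)

def resolve(query: str, local: DataFarm, remote: DataFarm) -> Tuple[str, str]:
    """Return (answer, name of the farm that served it), preferring local."""
    local_hit = local.answer(query)
    if local_hit is not None:
        return local_hit, local.name  # the data never leaves the site
    return remote.answer(query) or "unknown", remote.name

airport = DataFarm("tokyo-airport", {"train to Shinjuku": "Narita Express, platform 1"})
cloud = DataFarm("overseas-cloud", {"train to Shinjuku": "Narita Express, platform 1"})

answer, served_by = resolve("train to Shinjuku", airport, cloud)
print(served_by)  # → tokyo-airport
```

The same answer exists in both stores, yet the local farm serves it: no trans-Pacific round trip, no mirrored copy, no intermediary mining the question.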
Data transmission companies supply the largest data mining companies with an ever increasing amount of detail on the lives of billions. These local operators seek investments, permits and guaranteed revenues from governments, and in the process unknowingly export all the national data out of the country. The data is not sent once, but many times over, and is kept on foreign servers. Data miners will even request government subsidies for transoceanic cables. If you analyze the legislation that accompanies these activities, you realize that citizens are data suppliers treated like slaves: no voice and no rights.
If, on the other hand, the data is processed locally through the light infrastructure, without the need for wireless communication, the global fiber optic cable network or even satellite connectivity, then the information is gathered on site only (at a home, a community or an airport) and controlled on site. Thus, the relationship between data supply and data demand is direct, without intermediaries, and without analysis whose sole aim is to sell and feed consumerism, spurred by online sales where everything is available with a click.
Data farming is not only about our privacy and the capacity to learn about our own use and needs; it is like organic farming: it ensures that we replenish the soil, and the community, without an excessive consumption of energy. This safeguards the integrity of the information and the privacy of the supplier, and creates a platform to learn about oneself and to become resilient by knowing who you are and what is really needed.
3.4. From Local Farming to Agroforestry of Data
Data farming will ensure local composting, cross fertilization, natural hybrids, and local pest and viral control through simple interventions with complementary species, avoiding monocultures and economies of scale at all cost. This puts a stop to pests and replenishes the soil. Just as the parallel between datamining and petroleum holds up all too well, so does the comparison between data farming and local organic farming. Just as a soil stripped of all its nutrients leaves us dependent on a global food supply highly reliant on fertilizers, so, by the same logic, a community stripped of all its data is deprived of its livelihood.
Local data farming secures the growing and understanding of local resources, the local weather and the social tissue that sustains a committed and stable community. It is an ongoing search for ever better, for synergies and complementarities. A network of data farms evolves into data agroforestry, with pillars of common interest structured around centers for data that are analogous to water, seed banks, localized weather forecasts and sources of shade. Do we realize that a forest generates 500 tons of biomass per hectare per year, often starting from a very poor soil that is enriched over time into a lush ecosystem, while monoculture farming of soy and corn can barely provide 10 tons of biomass per year and depletes the soil, becoming increasingly dependent on outside supplies of chemicals, fertilizers and even genetics? The contrast between data farming and datamining is exactly the same.
The revolution of data farming and data agroforestry sets the stage for how data gathering can be inspired by ecosystems and natural farming, not just computational skills. It is the business model, as well as the societal model, that underpins this transition. Just as fast urbanization facilitated datamining, re-ruralization, or the return to life within the carrying capacity, will promote data farming. This enriches the local community, using local clusters of information that offer a solid backbone to a resilient society with a culture of lifelong learning. The community masters all its data and provides all key information locally, without cluttering servers and fiber optic networks, while saving up to 90% in energy. Now the internet turns into the promoter of the most efficient lights, merging with routers into one local data relay so precise, even unhackable, that it makes 5G antennas look like dinosaurs even before their massive rollout.
Most important: this data farming sets up communities to control their data and protect their privacy, but also to learn and to earn their fair share of an income that is today totally beyond their grasp.
A search for the latest results of Football Club Barcelona brings you to the website of the club. It only lets you read its public information if you permit its system to insert, over and above the cookies your browser already carries, an additional 788 cookies. These cookies are classified in detail: 221 strictly necessary, which you cannot refuse; 30 to add functionality; 194 for analysis; and 343 for behavioral advertising. Thanks to the European Union’s efforts you can decline 567 of them, but not the first 221! Nearly 2,000 cookies within just two clicks, from connecting to Bing through to Barcelona, is what is required to learn the latest scores of your favorite club. This hard reality gives you an idea of how sophisticated datamining has become. We suspect that a brand like FC Barcelona is able to put some pressure on Bing and Google to share the pot of gold, and that the search engines have advised Barcelona on how to get more data and money out of its enthused fans. You and I are merely fodder for the system.
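The cookie counts in this example are internally consistent, as a quick arithmetic check shows (the figures are taken directly from the example above):

```python
# The cookie tally from the FC Barcelona example, checked arithmetically.
strictly_necessary = 221  # cannot be refused
functionality = 30
analysis = 194
behavioral_ads = 343

total = strictly_necessary + functionality + analysis + behavioral_ads
declinable = total - strictly_necessary  # what EU rules let you refuse
print(total, declinable)  # → 788 567
```

Seventy-two percent of the cookies can be declined, but only because regulation forces the option; the 221 “strictly necessary” ones remain non-negotiable.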
Therefore, it is time to focus on the new concept of data farming. Actually, it is not so new; it is the way the internet operated in its infancy. Conscious communities, aware of the dramatic and adverse effects of datamining on privacy and the local economy, have the capacity to create an enriching learning environment where discoveries are made and challenges are continuously addressed with care and attention for the common good, building resilience. Local entrepreneurs could create the initial building blocks of this novel approach, starting with a proprietary platform for data farming based on light transmission. And there are thousands (even millions) who will quickly join to build on open source concepts, from software to hardware, to facilitate access and governance. No one expects this to be a roaring success from the outset; however, it will create a small opening into the world market of datamining. Just as a few cities in the world decided early on to aim for carbon neutrality, and some companies want to be recognized as pioneers in zero emissions, communities will pioneer data farming. By the same logic, more and more “minorities” are waking up to the fact that datamining permeated all our digital lives in a matter of a decade. Data farming as a way of life, a new culture of caring for one another’s data, could follow the same path and prepare for lifelong learning. While datamining and data farming could live in the same space, at least people are offered a choice.
Once data farming communities can exchange in full transparency, they will be able to build a diverse community that welcomes different opinions, where points of view can be emotional, scientific or business-like, but all are respected. There is no private entity that dictates the rules to the participants and the community in small print and with overt threats. On the contrary, it is the community and its citizens that determine the core conditions that set the framework. Since all participants are in such proximity, and the platform thrives on continuous learning, the overall spirit of data farming permits everyone to contribute to health and resilience, even those we do not understand.
We can even look our neighbors into the eye and smile.
And that is contagious.
 While the English language would prescribe that the fungus be described as “it”, we find this grammatical form difficult to accept when the level of intelligence of this sentient being clearly surpasses our expectations.
The aim of this blog is to present a fresh look at realities around us. Whereas I do not pretend to present the truth nor a definite position, I do wish to push the reader to think beyond the obvious. After all, time has come to dramatically improve the plight of millions, and that requires more than the predictable. Sometimes it forces us into spheres of discomfort.