Can climate change be disrupted?

20 September 2021

The past few years have proven critical for climate change and the way we as humans deal with it. Despite continuous warnings from scientists and now tangible proof that our planet is in an unprecedented crisis, people still struggle to make drastic changes to their lifestyles and consumption habits. Governments worldwide are trying to catch up and take urgent measures to reduce and control fossil fuel emissions within the next decade, as suggested by the IPCC’s Sixth Assessment Report published this summer.

However, such a holistic and multidimensional problem requires every possible means of bringing it under control. People and politicians, but also recent technological innovations, can help reduce the consequences of climate change the world will experience. A recently published paper, “Tackling Climate Change with Machine Learning” (David Rolnick et al., June 2019), suggests that there are numerous climate-related problems for which Artificial Intelligence and Machine Learning can provide feasible solutions. Even though AI “is not a silver bullet“, it introduces new ways of achieving our goal of an environmentally neutral life on Earth.

Smart cities

By using data analysis and various IT sensors (e.g. traffic cameras or smartphones), municipalities will be able to detect activities that consume large amounts of energy, such as mobility during peak hours. This can be regulated by establishing vehicle-sharing companies, like those already operating in Rotterdam. Such cars provide only the relevant data about their location, use and condition at all times, which can help guide traffic and inform regulations when necessary.
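As a rough illustration (not code from the paper), the sketch below shows how hourly traffic-sensor readings might be screened for peak-hour activity that a municipality could act on; the data, column names and threshold are all hypothetical:

```python
import pandas as pd

# Hypothetical hourly vehicle counts collected from traffic cameras / smartphones.
readings = pd.DataFrame({
    "timestamp": pd.date_range("2021-09-20 00:00", periods=24, freq="H"),
    "district": ["centre"] * 24,
    "vehicle_count": [120, 80, 60, 50, 70, 200, 900, 1500, 1400, 800,
                      600, 650, 700, 680, 720, 900, 1300, 1600, 1100,
                      700, 500, 400, 300, 200],
})

# Flag hours whose traffic exceeds the daily mean by 50% --
# these are the candidate "peak hours" for regulation or car sharing.
threshold = readings["vehicle_count"].mean() * 1.5
peak_hours = readings[readings["vehicle_count"] > threshold]
print(peak_hours[["timestamp", "vehicle_count"]])
```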

Using such innovations, it is also possible to build low-emission infrastructure: efficient and accessible transportation systems, coordinated district heating and cooling networks, solar power generators and charging stations for electric vehicles, and street lighting whose intensity is regulated based on historical traffic per hour (see the sketch below).
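A minimal sketch of the street-lighting idea, assuming historical vehicle counts per hour are available; the dimming rule and the numbers are purely illustrative:

```python
def light_intensity(hour: int, hourly_traffic: dict, min_level: float = 0.2) -> float:
    """Scale street-light intensity (0..1) with historical traffic for that hour.

    Illustrative rule: lights never drop below `min_level`, and scale
    linearly up to full brightness at the busiest hour on record.
    """
    busiest = max(hourly_traffic.values())
    share = hourly_traffic.get(hour, 0) / busiest
    return max(min_level, share)

# Hypothetical historical counts (vehicles per hour) for one street.
history = dict(zip(range(24),
                   [30, 10, 5, 5, 10, 80, 400, 900, 850, 500, 400, 450,
                    500, 480, 520, 600, 800, 950, 700, 400, 250, 150, 80, 40]))

print(light_intensity(3, history))   # late night -> dimmed to the minimum
print(light_intensity(17, history))  # evening rush -> full brightness
```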

Climate prediction

ML models can improve the prediction of extreme events and natural disasters using data collected from ice cores and through climate downscaling. It is even possible to combine the predictions of many climate models and thus produce more accurate and realistic forecasts, which will help governments prepare for upcoming changes and possibly uncover areas where some effects of climate change could be reversed.
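One simple way to combine several models is a skill-weighted ensemble: models that have tracked past observations more closely get more weight. The sketch below is a toy version of that idea; the forecasts and error figures are invented:

```python
import numpy as np

# Hypothetical temperature-anomaly forecasts (°C) from three climate models
# for the same five future periods, plus each model's historical error (RMSE).
forecasts = np.array([
    [1.1, 1.3, 1.6, 1.9, 2.3],   # model A
    [0.9, 1.2, 1.5, 2.0, 2.6],   # model B
    [1.2, 1.4, 1.8, 2.2, 2.7],   # model C
])
historical_rmse = np.array([0.20, 0.35, 0.25])

# Weight each model by the inverse of its past error, then normalise.
weights = 1.0 / historical_rmse
weights /= weights.sum()

combined = weights @ forecasts   # weighted-average forecast per period
print(combined)
```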

Another interesting application that AI brings to climate management is the ability to visually show users the projected effects of extreme weather on their home or a chosen location. People can now see what is coming, making it easier for them to accept that climate change is real and to take action.

These were only a few of the review’s suggestions for using innovative technologies in climate change management. For example, AI and ML can also measure where carbon is coming from by analysing satellite images of power plants and their surroundings, and that carbon can potentially be removed through direct air capture of the facility’s exhaust. Most industries worldwide have become highly digitized, and it is their duty to use this technological advantage to become environmentally neutral in the coming years. Although some of these techniques may need more time to mature, it is important for businesses and policy makers to start using all available resources to improve the state of the climate and save our one and only home. After all, there is no Plan(et) B.
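As a toy illustration of the remote-sensing idea mentioned above (not code from the paper), the sketch below shows the kind of convolutional regressor that could map a satellite image patch of a power plant to an estimated emission rate; the architecture, input size and units are all made up:

```python
import torch
import torch.nn as nn

class EmissionEstimator(nn.Module):
    """Hypothetical CNN that regresses an emission rate from an image patch."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted emission rate, e.g. tonnes CO2 per hour
        )

    def forward(self, x):
        return self.head(self.features(x))

# One fake 64x64 RGB "satellite" patch, just to show the tensor shapes involved.
patch = torch.randn(1, 3, 64, 64)
print(EmissionEstimator()(patch).shape)  # -> torch.Size([1, 1])
```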


Project Natick: The future of cloud computing is located underwater

11 September 2021

During my studies, as I got deeper into the details of how businesses operate these days and how they use and analyse data, I became convinced of the importance of cloud computing and its role as an essential component of every software solution provider. Cloud computing is defined as the delivery of computing services over the Internet in order to offer faster innovation, flexible resources and economies of scale. That includes, among other things, servers, storage and databases.

The technology in this field has advanced so far that businesses are constantly searching for ways to work more efficiently and reduce their costs while expanding their reach through data analytics. Microsoft, one of the biggest players in the software industry, not only could not stay out of the game of adapting its technology to the newer norms, but decided to take it one step further. The highly interactive cloud future increased the need for Microsoft to offer its services close to all users, including those living near the shore, who account for almost half the world’s population. Sean James, a Microsoft employee, had been observing data centers on land and pointed out maintenance issues caused by poor cooling techniques and uncontrolled environmental conditions. In 2013 he published a research paper that started a revolutionary journey. Familiar with advanced electronics deployed below sea level, he suggested that these problems would be eliminated if data centers were moved underwater. Effectiveness would increase because fewer factors, such as oxygen, could cause failures, and the company could serve more customers on a smaller budget while reducing demand for the last remaining free space on land.

“Project Natick” stayed at a theoretical level until 2018, when the first underwater data center was deliberately sunk in the Northern Isles. Two years later, in July 2020, it was time to retrieve the capsule and examine the results, which are still being reviewed. So far, Microsoft has stated that the hardware that spent two years in the underwater data center was eight times more reliable than equivalent servers running on land. That reliability was recently put to use when Natick performed COVID-19 research for Folding@home and World Community Grid.
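To put that figure in perspective, “eight times more reliable” roughly means one eighth the failure rate. A toy calculation of what that ratio implies for a capsule of 864 servers (the count reported for the Northern Isles deployment; the land-based failure rate below is made up):

```python
# Hypothetical fraction of servers failing over one deployment on land.
land_failure_rate = 0.08
sea_failure_rate = land_failure_rate / 8   # "eight times more reliable"
servers = 864

print(f"expected failures on land:     {servers * land_failure_rate:.1f}")
print(f"expected failures under water: {servers * sea_failure_rate:.1f}")
```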

The results amazed the software industry. The proven reliability of underwater data centers can help serve customers who need to deploy and operate tactical and critical databases anywhere on Earth, and it can also help Microsoft pursue its sustainability strategy around energy, waste and water. Phase 2 of the project is already in progress and aims to surface the design and operational issues that may occur post-deployment and that will require subsequent work, in particular productization. After all, 50% of us live near the coast, why doesn’t our data?
