Innovating Learning with Canv-AI: A GenAI Solution for Canvas LMS

17 October 2024


In today’s educational landscape, generative AI (GenAI) is reshaping how students and instructors interact with learning platforms. A promising example is Canv-AI, an AI-powered tool designed to integrate into the widely used Canvas Learning Management System (LMS). This tool aims to transform both student learning and faculty workload by leveraging advanced AI features to provide personalized, real-time support.

The integration of Canv-AI focuses on two primary groups: students and professors. For students, the key feature is a chatbot that can answer course-specific questions, provide personalized feedback, and generate practice quizzes or mock exams. These features are designed to enhance active learning, where students actively engage with course material, improving their understanding and retention. Instead of navigating dense course content alone, students have instant access to interactive support tailored to their learning needs.

Professors benefit from Canv-AI through a dashboard that tracks student performance and identifies areas where students struggle the most. This insight allows instructors to adjust their teaching strategies in real-time, offering targeted support without waiting for students to seek help. Additionally, the chatbot can help reduce the faculty workload by answering common questions about lecture notes or deadlines, allowing professors to focus more on core teaching tasks.

From a business perspective, Canv-AI aligns with Canvas’s existing subscription-based revenue model. It is offered as an add-on package, giving universities access to AI-driven tools for improving educational outcomes. The pricing strategy is competitive, with a projected $2,000 annual fee for universities already using Canvas. The integration also brings the potential for a significant return on investment, with an estimated 29.7% ROI after the first year. By attracting 15% of Canvas’s current university customers, Canv-AI is expected to generate over $700,000 in profit during its first year.

The technological backbone of Canv-AI relies on large language models (LLMs) and retrieval-augmented generation (RAG). These technologies allow the system to understand and respond to complex queries based on course materials, ensuring students receive relevant and accurate information. The system is designed to be scalable, using Amazon Web Services (AWS) to handle real-time AI interactions efficiently.
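The retrieval-augmented generation pattern described above can be sketched in a few lines. This is an illustrative toy, not Canv-AI's actual implementation: the bag-of-words "embedding" stands in for a real neural embedding model, and the course chunks are invented examples.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG system would use a neural embedding model.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    # Rank course-material chunks by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

def build_prompt(query, chunks):
    # The retrieved context is prepended so the LLM answers from course material
    # rather than from its general training data.
    context = "\n".join(retrieve(query, chunks))
    return f"Context:\n{context}\n\nQuestion: {query}"

chunks = [
    "The midterm exam covers chapters 1 through 4.",
    "Office hours are held on Tuesdays at 3 pm.",
]
print(build_prompt("When are office hours?", chunks))
```

The key design point is that only the most relevant chunks reach the model's prompt, which keeps answers grounded in the uploaded course content.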

However, the integration of GenAI into educational systems does come with challenges. One concern is data security, especially the protection of student information. To address this, Canv-AI proposes the use of Role-Based Access Control (RBAC), ensuring that sensitive data is only accessible to authorized users. Another challenge is AI accuracy. To avoid misinformation, Canv-AI offers options for professors to review and customize the chatbot’s responses, ensuring alignment with course content.
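Role-Based Access Control boils down to checking a user's role against an explicit permission table before any data is served. The roles and permission names below are hypothetical illustrations, not Canvas's real permission model.

```python
# Hypothetical role-to-permission mapping for an LMS; Canvas's actual model differs.
PERMISSIONS = {
    "student": {"view_own_grades", "chat_with_bot"},
    "professor": {"view_own_grades", "chat_with_bot",
                  "view_class_analytics", "edit_bot_responses"},
    "admin": {"view_own_grades", "chat_with_bot", "view_class_analytics",
              "edit_bot_responses", "manage_users"},
}

def is_allowed(role, action):
    # Deny by default: access is granted only if the role explicitly
    # carries the requested permission.
    return action in PERMISSIONS.get(role, set())

print(is_allowed("professor", "view_class_analytics"))  # True
print(is_allowed("student", "view_class_analytics"))    # False
```

Because every request passes through one deny-by-default check, a student account can never reach the class-wide analytics that only instructors should see.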

In conclusion, Canv-AI offers a transformative solution for Canvas LMS by enhancing the learning experience for students and reducing the workload for professors. By integrating GenAI, Canvas can stay competitive in the educational technology market, delivering personalized, data-driven learning solutions. With the right safeguards in place, Canv-AI represents a promising step forward for digital education.

Authors: Team 50

John Albin Bergström (563470jb)

Oryna Malchenko (592143om)

Yasin Elkattan (593972yk)

Daniel Fejes (605931fd)


How Will Generative AI Be Used in the Future? Answer: AutoGen

21 October 2023


The generative AI tools we know today include ChatGPT, Midjourney, DALL·E 3, and many more. These tools are advanced, but they have flaws, such as being unable to carry out long, iterative workflows. Now there is something new called AutoGen. AutoGen is an open-source project from Microsoft that was released on September 19, 2023. At its core, AutoGen is a generative AI framework built around agents that work together in loops. Agents are, in essence, pre-specified workers that can take on any role: there are agents that can write code well and agents that can review the generated code and give feedback. Agents can be made to do anything and become experts in any field, from marketing to healthcare.

An example of what AutoGen can do is the following: if I want to write some code to get the stock price of Tesla, I could use ChatGPT, and it will output some code. Most of the time, code written by ChatGPT via the OpenAI website will contain errors. With AutoGen, however, two or more agents are at work: one outputs code, and a second runs that code and tells the first if something is wrong. This cycle of generating and running code continues until the code works and produces the correct output. The user no longer has to run the code manually and ask for the errors to be fixed; with AutoGen, it is done automatically.
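The generate-run-review loop can be illustrated without the real AutoGen library. In this sketch the "coder" agent is faked with a scripted list of drafts (first buggy, then revised), and the "executor" agent runs each draft and feeds errors back; AutoGen's actual agents would call an LLM instead.

```python
import traceback

def executor(code):
    # Run the candidate code and report either the result or the error,
    # playing the role of AutoGen's code-execution agent.
    scope = {}
    try:
        exec(code, scope)
        return True, scope.get("result")
    except Exception:
        return False, traceback.format_exc()

def coder(feedback, drafts):
    # Stand-in for the code-writing agent: returns its next draft, as if it
    # had revised the previous code based on the executor's error feedback.
    return drafts.pop(0)

drafts = [
    "result = stock_price('TSLA')",                        # buggy: undefined function
    "prices = {'TSLA': 242.0}\nresult = prices['TSLA']",   # revised draft (made-up price)
]

feedback = None
while True:
    ok, output = executor(coder(feedback, drafts))
    if ok:
        break
    feedback = output  # the error message becomes the next prompt to the coder

print(output)  # 242.0
```

The loop terminates only when the executor runs the code without errors, which is exactly what spares the user from the manual copy-run-paste-the-error cycle.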

I also tried to create some code with AutoGen. I first installed all the necessary packages and got myself an API key for OpenAI GPT-4. Then I started working on the code and decided to create the game “Snake”. Snake is an old and easy game to create, but it might be a challenge for AutoGen. I started the process of creating the Snake game, and it had a good first run: I was able to create a first, simple version of the game. I then came up with some iterations to improve it. The game now also has obstacles that end the game if the snake bumps into one. This, too, was made by AutoGen without any problems. After playing around, I was really amazed at how powerful AutoGen is, and I can only imagine what else can be created with it.

AutoGen is a very promising development and may well shape the future of professional code development and task automation. As large language models (LLMs) become more powerful, AutoGen will too, because each individual agent will be more powerful. It will be interesting to follow this development and see whether AutoGen could one day create games that do not yet exist.


Diagnosis: Cyberattack – A New Threat for Healthcare

2 October 2020

Cybercrime and healthcare… One might think: what a weird combination, right? However, I have to disappoint you. It is a cruel reality.

But let’s start at the beginning with the enabler: it is, what a ’surprise’, the increasing use of technology in the healthcare industry. But using technology does not only imply risks. We all know how beneficial technology in healthcare is. No matter which technology, it (most of the time…) all comes down to an increase in efficiency and effectiveness (AIMS Education 2019). Furthermore, those improvements aim to increase our quality of life while, hopefully, reducing its costs (AIMS Education 2019).

One of the easiest and best examples of technological adoption in healthcare is the digitalization of health records (Online Health Informatics 2020). Do you remember any of your doctors using a paper record? No? Me neither. This example might sound too simple to be true. However, digital health records have had a positive impact not only on the quality of public healthcare but also on its costs. Those records can be communicated through the Internet of Things (IoT) within hospitals and stored in, e.g., clouds (Jayaraman et al. 2020).

The consequences are tremendous: due to the sensitivity of medical data, its value is constantly increasing, making it an attractive target for cybercrime (Jayaraman et al. 2020). To get a glimpse of how valuable healthcare records are: their value can be up to 20 times higher than that of credit card details…

Cybercrime – two real-world examples and their dramatic consequences: The most recent (known) attack happened this Monday (28/09/20). The American hospital chain ‘Universal Health Services’, with its over 250 hospitals, experienced an IT outage due to a cyberattack, cutting off access to medical records and everything connected to WiFi (including the devices monitoring critical care patients) (CBS News 2020). Luckily, this cyberattack caused no fatalities. That was not the case two weeks earlier at a hospital in Düsseldorf, Germany, where a cyberattack caused the death of a critical patient (The Guardian 2020)…

Even though it is highly unethical to put monetary gains over human life, I personally think this trend will continue. The increasing use of interconnected devices in healthcare will create even more sensitive data, which will make it an even more attractive target for hackers…

What do you think? Will this trend continue, or can technological enhancements, such as blockchain, put an end to it? Let me know in the comments!

 

References:

AIMS Education. (2019). The Impact Of Technology In Healthcare. [online] Available at: <https://aimseducation.edu/blog/the-impact-of-technology-on-healthcare> [Accessed 1 October 2020].

CBS News. (2020). Cyberattack Hobbles Hospital Chain Universal Health Services. [online] Available at: <https://www.cbsnews.com/news/cyberattack-universal-health-services-hospital-chain-united-states/> [Accessed 1 October 2020].

Jayaraman, P. P. et al. (2020) “Healthcare 4.0: A Review of Frontiers in Digital Health,” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(2).

Online Health Informatics. (2020). 5 Ways Technology Is Improving Health. [online] Available at: <https://healthinformatics.uic.edu/blog/5-ways-technology-is-improving-health/> [Accessed 1 October 2020].

The Guardian. (2020). Prosecutors Open Homicide Case After Cyber-Attack On German Hospital. [online] Available at: <https://www.theguardian.com/technology/2020/sep/18/prosecutors-open-homicide-case-after-cyber-attack-on-german-hospital> [Accessed 1 October 2020]


Is Internet of Things driving our World?

7 October 2016

The world of technology is becoming more and more familiar with the Internet of Things; it has entered our lives and is everywhere around us, yet the term is not quite new. It was back in 1999 that Kevin Ashton, a British technology pioneer working at the Auto-ID Labs at the Massachusetts Institute of Technology (MIT), coined the term “Internet of Things”. But what actually is the Internet of Things (IoT), and why has it flooded into our lives?

The Internet of Things refers to any inter-networking of physical devices, vehicles, buildings, and other objects embedded with electronics, software, and sensors, enabling them to collect and exchange data. It allows objects to be sensed and controlled remotely across existing network infrastructure, creating opportunities for more direct integration of the physical world into computer-based systems. A short but accurate definition of IoT could be: “the infrastructure of the information society”. IoT is expected to offer advanced connectivity of devices and services that goes beyond machine-to-machine (M2M) communications. In fact, with IoT, what in an earlier age would have been called magic is now a reality: designed, planned and operated by technology pioneers around the world.

Nowadays, big innovative companies, with the technology industry in the lead, increasingly design and develop devices capable of connecting to the internet and interacting with their environment, creating a new way to collaborate, innovate and socialize. Smart devices like smartphones, smartwatches, electric cars and home equipment are some of the leading actors driving the evolution of the Internet of Things. A recent example concerning IoT is the smart city project, a very ambitious plan whose main goal is to manage a city’s assets through communication technology in order to improve quality of life.

Although IoT has become a crucial part of our everyday life, there is a major challenge that designers of web-connected products face and try to overcome: how to make the devices self-powering. Some smart devices, particularly wearable ones, consume enough energy during the day that users must devote attention to regular charging. It can become a really annoying brain teaser, especially if you have five or ten smart devices. To tackle this, wearable devices exploit movement and flexion to harvest energy, while other devices take advantage of their environment, using photovoltaic cells to generate solar power or drawing on other available infrastructure.

The Internet of Things poses some challenges, but also a world of opportunities, because it is applicable to a wide spectrum of sectors and markets. From the transportation industry to clothing, and from agriculture to entertainment, the take-off of IoT will push organizations and businesses to innovate, increase efficiency and become more sustainable. Also, predictions range from 20 to 50 billion products being connected to the Internet by the end of this decade, all of them designed to make life easier for us.

 

 

 


 

 


Revolutionary Tech: Quantum Computing

4 October 2016

In my last blog, I talked about Moore’s Law and how it is running out of steam. A possible successor to Moore’s Law is quantum computing. Practical quantum computers do not really exist yet, but major companies such as Intel and IBM are working on developing one (Intel, 2016). If the development is successful, it will be groundbreaking, disrupting many existing technologies that we are currently familiar with. It could even potentially be terrifying if used wrongly.

To explain quantum computing in a simple way, imagine a normal computer. It processes bits that can be either 1 or 0. A quantum computer uses qubits, which can be in both states at once (Wikipedia, 2016). The processing power it will have is enormous: it could run certain algorithms (multi-tasking) in just a fraction of the time a normal processor would. Truly revolutionary!
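The “both states at once” idea can be made concrete with a tiny simulation. This is a minimal sketch of the standard textbook model, not any real quantum hardware: a qubit is a 2-component vector of amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

# A qubit's state is a 2-component vector [amp_0, amp_1];
# the probability of measuring 0 or 1 is the squared magnitude of each amplitude.
zero = [1.0, 0.0]  # the classical bit 0

def hadamard(state):
    # The Hadamard gate turns a basis state into an equal superposition of 0 and 1.
    h = 1 / math.sqrt(2)
    a, b = state
    return [h * (a + b), h * (a - b)]

def probabilities(state):
    return [abs(amp) ** 2 for amp in state]

superposed = hadamard(zero)
print(probabilities(superposed))  # both probabilities are ~0.5: equally likely 0 or 1
```

One qubit holds two amplitudes, two qubits hold four, and n qubits hold 2^n, which is where the enormous (if specialized) processing power comes from.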


So what are the implications of a quantum computer?

Hackers could penetrate most of your private information as if there were no security: data encryption as we know it would be nearly nullified (makeuseof.com, 2014). On the other hand, what about the NSA trying to analyze vast amounts of data about everyone? It could become really creepy: a world where everything is predictable and privacy is almost nonexistent.

Even the development of artificial intelligence would get a huge boost, with many benefits and drawbacks. It may even become possible for machines to grow smarter than humans, and quantum computing might be a cornerstone in achieving this.
This could be a development that dramatically changes businesses as well, just like the internet did back in the day.

So do we really want to have a quantum computer? The likely answer is yes, but in my opinion there are many things that need to be considered first, for example privacy and security concerns. Eventually, I think this will become an essential technology, just as the internet is today. Solutions will be found for many of the problems that arise, as history has shown, but it is important to reflect on the consequences of innovative technologies. They are not always good.

 

Sources:

Click to access promise_of_quantum_computing.pdf

https://en.wikipedia.org/wiki/Quantum_computing

Quantum Computers: The End of Cryptography?

 

 


Moore’s Law is Dead?! What’s next?

3 October 2016

Much has changed over time, and the computer chip as we know it is becoming more and more important. With the Internet of Things on the horizon and electronics becoming ever more integrated into our daily lives, it is important that chips continue to improve in performance and size.

So what is Moore’s Law? Moore’s Law is the observation that the number of transistors in a chip doubles approximately every two years (Wikipedia, 2016). In short, this “law” is just an observation and prediction, and it dates from 1965. It is remarkable that it has held accurate for over fifty years.
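The doubling rule is easy to compute. As a rough sketch, using Intel’s 4004 from 1971 (about 2,300 transistors) as the commonly cited baseline, extrapolating the two-year doubling gives:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    # Moore's Law as a formula: the count doubles every `doubling_years`
    # starting from the 4004's roughly 2,300 transistors in 1971.
    return base_count * 2 ** ((year - base_year) / doubling_years)

# 50 years is 25 doublings, i.e. a factor of 2**25 (about 33.5 million):
print(f"{transistors(2021):,.0f}")  # 77,175,193,600 — tens of billions
```

That extrapolated figure of tens of billions of transistors is indeed the order of magnitude of today’s largest chips, which is why the observation held up so remarkably for five decades.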


 

The reason Moore’s Law is so important is that it is closely related to the pace at which chip performance develops. It is used as a guideline for developing better semiconductors (chips) roughly every two years, and many major companies such as Intel and Nvidia have followed it (Intel.com, 2016).
Moreover, the performance of a chip is related to the number of transistors it has, and as chips become more powerful their size can also decrease. This is really useful for the Internet of Things.

But nowadays it is increasingly difficult to keep up with Moore’s Law, since it is becoming ever more costly for companies to develop smaller die sizes. Further improvements are increasingly expensive and physically harder to implement. Thus the pace of Moore’s Law is slowing down and, in a way, dying; Intel has actually stepped off the pace set out by the law (MIT, 2016).


So what will be next? If Moore’s Law is running out, does this mean the pace at which semiconductors are developed will slow drastically?

Probably not, I would think. Although highly costly, alternatives such as different materials could still allow further improvements (Arstechnica.com, 2016). Or perhaps a very disruptive innovation, such as quantum computing, might be the solution.

Sources:

http://arstechnica.com/gadgets/2016/07/itrs-roadmap-2021-moores-law/
http://arstechnica.com/gadgets/2016/07/itrs-roadmap-2021-moores-law/

http://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html

https://en.wikipedia.org/wiki/Moore%27s_law

https://www.technologyreview.com/s/601102/intel-puts-the-brakes-on-moores-law/

 
