AI-Enabled 6G: Embracing Wisdoms from Classical Algorithms
Abstract: The past five years have witnessed ever-increasing research interest in artificial intelligence (AI) for the design of 6G wireless systems [1,2]. Despite unprecedented performance gains, the black-box nature of existing AI algorithms has raised crucial concerns, e.g., insufficient scalability, poor generalization, and the lack of theoretical guarantees, which contradict the stringent reliability requirements in practice. By contrast, classical algorithms mostly enjoy well-grounded theoretical analysis; however, because they are built upon simplified signal and system models, their performance tends to be limited in complicated real-world deployments. In this talk, we begin by introducing the 6G vision, challenges, and opportunities. Then, by bridging AI with the wisdom of classical algorithms, we introduce two general frameworks that may offer the best of both worlds, i.e., both competitive performance and theoretical support. The first framework, called neural calibration, targets low-complexity non-iterative algorithms. Based on the permutation equivariance property, neural-calibrated algorithms can scale with the problem size and generalize across varying network settings, making them suitable for dynamic large-scale systems. The second framework, termed fixed point networks, is compatible with the general iterative algorithms that are prevalent in wireless transceiver design. Based on fixed point theory, provably convergent and adaptive AI-enhanced iterative algorithms can be constructed in a unified manner. Along with the general frameworks, we also present their applications to CSI feedback, beamforming, and channel estimation, among others, in emerging 6G wireless systems.
[1] K. B. Letaief, Y. Shi, J. Lu, and J. Lu, “Edge artificial intelligence for 6G: Vision, enabling technologies, and applications,” IEEE Journal on Selected Areas in Communications, vol. 40, no. 1, pp. 5-36, Jan. 2022.
[2] K. B. Letaief, W. Chen, Y. Shi, J. Zhang, and Y.-J. Zhang, “The roadmap to 6G: AI empowered wireless networks,” IEEE Communications Magazine, vol. 57, no. 8, pp. 84-90, Aug. 2019.
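To give a flavor of the second framework described above, the following is a minimal, illustrative sketch (my own simplification under stated assumptions, not the speaker's implementation): a classical model-based update is blended with a small, hypothetical learned correction and damped so that the overall map stays contractive, which is what lets the loop provably converge to a fixed point.

```python
# Minimal sketch of an AI-enhanced fixed-point iteration in the spirit of
# "fixed point networks". The "learned" weights W are a randomly initialized
# stand-in, and the toy problem (solving A x = b) is chosen only for brevity.
import numpy as np

rng = np.random.default_rng(0)

n = 8
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # toy system model
b = rng.standard_normal(n)

W = 0.01 * rng.standard_normal((n, n))  # hypothetical learned correction weights
alpha, beta = 0.5, 0.9                  # classical step size and damping factor

def T(x):
    """One AI-augmented iteration: a model-based step plus a small learned
    residual term, blended with damping so that T remains contractive."""
    residual = A @ x - b
    classical = x - alpha * residual     # classical update
    learned = W @ residual               # learned correction (illustrative)
    return beta * (classical + learned) + (1.0 - beta) * x

x = np.zeros(n)
for k in range(500):
    x_next = T(x)
    done = np.linalg.norm(x_next - x) < 1e-8   # fixed point reached
    x = x_next
    if done:
        break

print(f"stopped after {k + 1} iterations, residual {np.linalg.norm(A @ x - b):.2e}")
```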
IEEE Fellow
Some Methods to Improve IoT Performance and Cybersecurity
Institute of Theoretical & Applied Informatics, Polish Academy of Sciences; Laboratoire I3S, Univ. Côte d’Azur; and Cognitive Networks Ltd, UK
Abstract: The relative simplicity and lightweight nature of many IoT devices, together with their widespread connectivity via the Internet and other wired and wireless networks, raise issues regarding both their performance and their vulnerability. Indeed, their connectivity patterns, based on the need to frequently forward and receive data, have given rise to the “Massive Access Problem (MAP) of the IoT,” a form of congestion caused by the IoT’s synchronized and repetitive data transmission patterns. On the other hand, the opportunity that IoT devices present to malicious third parties for generating highly contagious distributed denial of service (DDoS) and Botnet attacks is also a widely studied subject of concern. This presentation will therefore discuss our recent results and research directions that address both of these issues. Regarding the MAP, we will outline the Quasi-Deterministic Transmission Policy (QDTP) and its main theoretical result, and present trace-driven measurements which show that QDTP can effectively mitigate the MAP. We will also show how a machine learning approach using novel Auto-Associative Dense Random Neural Networks can detect DDoS attacks with a high degree of accuracy, and discuss the potential of “low cost” online learning to protect IoT gateways and devices against cyberattacks. The speaker gratefully acknowledges research funding from the EC as part of the H2020 GHOST, SerIoT and IoTAC projects.
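As a rough illustration of the idea behind a quasi-deterministic transmission policy (a sketch under my own assumptions, not the speaker's code, parameters, or theoretical result), the snippet below shapes a synchronized burst of IoT packets so that consecutive forwardings are spaced by at least a fixed interval D, and compares the worst-case queueing delay at a single gateway with and without the shaping.

```python
# Illustrative toy comparison: synchronized IoT bursts vs. QDTP-style shaping.
# Packets are released so that release_i = max(arrival_i, release_{i-1} + D),
# which smooths the burst seen by the gateway. The numbers (1000 devices,
# D = 2.5 ms, 2 ms service time) are arbitrary assumptions for the demo.
import random

random.seed(1)

# 1000 devices all wake up within 50 ms of each other (synchronized traffic).
arrivals = sorted(random.uniform(0.0, 0.05) for _ in range(1000))

D = 0.0025        # minimum spacing enforced by the quasi-deterministic policy
service = 0.002   # gateway service time per packet

def gateway_delays(release_times):
    """FIFO waiting times at a single-server gateway."""
    delays, free_at = [], 0.0
    for t in release_times:
        start = max(t, free_at)
        delays.append(start - t)
        free_at = start + service
    return delays

# Without shaping: packets hit the gateway as they arrive.
burst_delays = gateway_delays(arrivals)

# With QDTP-style shaping applied before the gateway.
releases, prev = [], float("-inf")
for t in arrivals:
    prev = max(t, prev + D)
    releases.append(prev)
qdtp_delays = gateway_delays(releases)

print(f"max gateway delay, bursty: {max(burst_delays) * 1e3:.1f} ms")
print(f"max gateway delay, shaped: {max(qdtp_delays) * 1e3:.1f} ms")
```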
Leveraging Urban Computing with Smart Internet of Drones
Abstract: Urban computing (UC) is an interdisciplinary field that seeks to improve people’s lives in urban areas. To achieve this objective, UC collects and analyzes data from several sources. In recent years, the Internet of Drones (IoD) has received significant attention from the academic community and has emerged as a potential data source for UC applications. The goal of this talk is to examine how the IoD can connect to and leverage UC in a variety of applications, including public safety and security, the environment, traffic improvement, and drone-assisted networks, to mention just a few. In this context, data acquired by the IoD can fill gaps in data collected from other sources and provide new data for UC, given the aerial view of drones. Thus, we shall first introduce the relationship between the concepts of UC and IoD, then discuss our proposed general framework considering the perspective of the IoD for UC, followed by design guidelines for Internet of Drones location privacy protocols. Last but not least, we shall discuss some key challenges in this emerging area.
Canada Research Chair (Tier 1) and Distinguished University Professor, University of Ottawa, Canada
Generative Artificial Intelligence: Opportunities and Challenges
IEEE Fellow, Professor of Machine Learning at MBZUAI, UAE
Abstract: The talk presents recent trends and significant advances in Artificial Intelligence (AI), namely Generative Artificial Intelligence (GAI). As demonstrated by impressive accomplishments in the field (such as ChatGPT, BARD, LLaMA and other generative AI-based engines), and due to fundamental advances in machine learning and artificial intelligence, many predict we are at the cusp of a new technological revolution whose impact will affect all of humanity. AI is expected to grow world GDP by up to 20% by 2025, amounting to more than 15 trillion dollars of growth over the next few years. These developments have significantly impacted technological innovations in the fields of the Internet of Things, self-driving machines, powerful chatbots, virtual assistants, intelligent human-machine interfaces, large language models, real-time translators, cognitive robotics, high-quality disease diagnosis, remote health care monitoring, financial market prediction, and Fintech, to name a few. Although AI constitutes an umbrella of several interrelated technologies, all of which aim to imitate, to a certain degree, intelligent human behavior or decision making, deep learning algorithms are considered the driving force behind the explosive growth of AI and its applications in almost every sector of the modern and global economy. The talk outlines the major milestones that led to the current growth in AI and GAI, the role of academic institutions, industry, and government, and discusses some of the significant achievements in the field. It also highlights the real challenges that arise when these innovations are misused, leading to potential adverse effects on society and end-users.
Visible Light Communication (VLC) for Cars
Abstract: Visible light communication (VLC) is a promising technology for automotive applications because it offers several advantages over traditional wireless communication technologies that use radio waves. For example, VLC is more secure, as it is difficult to intercept VLC signals, and more reliable, as VLC signals are less susceptible to interference from other signals. As a result of these advantages, VLC is being considered for various automotive applications, such as in-car infotainment and navigation systems, advanced driver assistance systems (ADAS), and autonomous driving. In this talk, the speaker will explain VLC for cars using image sensor communication, a form of VLC that uses a camera as the receiver.
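To illustrate what using a camera as the receiver amounts to, here is a minimal toy sketch (an assumption-laden illustration, not the speaker's system): the LED region of each camera frame is reduced to a single brightness value, and that value is thresholded to recover one on-off-keyed bit per frame.

```python
# Toy camera-based VLC (image sensor communication) receiver. The brightness
# levels, ambient offset, and noise figure below are made-up assumptions; a
# real system would also handle rolling-shutter effects, synchronization, and
# higher-order modulation.
import numpy as np

rng = np.random.default_rng(42)

bits = rng.integers(0, 2, size=32)  # payload transmitted by the LED

# Transmitter side: per-frame brightness of the LED region (on-off keying),
# with an ambient-light offset and sensor noise added.
on_level, off_level, ambient = 180.0, 40.0, 20.0
frames = (ambient
          + np.where(bits == 1, on_level, off_level)
          + 5.0 * rng.standard_normal(bits.size))

# Receiver side: threshold the per-frame brightness at the midpoint between
# the observed extremes to recover one bit per frame.
threshold = (frames.max() + frames.min()) / 2.0
decoded = (frames > threshold).astype(int)

print("bit errors:", int(np.sum(decoded != bits)))
```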
Professor and Deputy Director at the Institute of Liberal Arts and Sciences, Nagoya University, Japan. |