[Internet of Things Opportunities in 2020]
With technological advances across every layer of IoT, from cloud computing to sensors and connectivity modules to LTE and now 5G, the industry has more opportunities than ever before. Implementation costs keep falling, we are close to crossing the chasm into mainstream use, and small and medium businesses are adopting IoT in growing numbers. So what are the opportunities on the horizon?
#Opportunity_One_Device_Enablement
Without device enablement, there is no IoT. Rather than building IoT devices from scratch, some of the fastest-growing startups in this space build solutions that let all types of devices connect to the internet. Building the enablement layer means helping any smart device, piece of machinery, or even household appliance connect to the internet efficiently. Although the challenge lies in consolidating standards across big brands and country-specific regulations, this layer is estimated to grow at a CAGR of 24% and is poised to be a US$30 billion market by 2023.
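To make the enablement layer concrete, here is a minimal sketch, assuming the device speaks MQTT (a common IoT messaging protocol) through the paho-mqtt library; the broker address, topic scheme, and EnabledDevice wrapper are hypothetical illustrations, not any particular vendor's product:

```python
# Minimal sketch of device enablement: wrap a legacy appliance so it can
# publish telemetry over MQTT. Broker address and topics are placeholders.
import json
import time
import paho.mqtt.client as mqtt  # pip install "paho-mqtt<2" (1.x API below)

class EnabledDevice:
    """Gives an arbitrary appliance a uniform path to the internet."""

    def __init__(self, device_id: str, broker: str = "broker.example.com"):
        self.device_id = device_id
        self.client = mqtt.Client(client_id=device_id)
        self.client.connect(broker, port=1883, keepalive=60)

    def publish_reading(self, sensor: str, value: float) -> None:
        payload = json.dumps({"device": self.device_id, "sensor": sensor,
                              "value": value, "ts": time.time()})
        self.client.publish(f"devices/{self.device_id}/{sensor}", payload)

# Usage: a 'dumb' fridge becomes an internet-connected sensor.
fridge = EnabledDevice("fridge-001")
fridge.publish_reading("temperature_c", 4.2)
```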
#Opportunity_Two_Cloud_Computing
It goes without saying that cloud computing has been one of the fastest-growing spaces of the last decade. Providers such as AWS, GCP, and Azure offer vast data storage capacity and computing power to fuel sophisticated functionality, security, and analytics. The real opportunity is to combine existing research data with cloud computing and IoT devices to deliver results that were previously impossible. Agriculture is currently at the frontier of these applications: by applying plant-nutrition data from ecological research, Philips Lighting's GrowWise Center can monitor, track, and control the environmental factors that influence a plant's growth. Manipulating the environment comes at a cost, but by using IoT to optimize the environment while balancing that cost, Philips has helped its growers maximize yield, profit, and quality like never before.
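As a toy illustration of optimizing the environment while balancing cost, the sketch below searches for the daily lighting duration that maximizes profit under a made-up diminishing-returns yield curve; the response model, crop price, and energy cost are invented placeholders, not Philips' data:

```python
# Toy optimization: pick the lighting duration an IoT controller would apply.
# All numbers below are invented for illustration.
import math

def expected_yield(light_hours: float) -> float:
    """Diminishing returns: more light helps, but the effect saturates."""
    return 10.0 * (1 - math.exp(-0.2 * light_hours))

def profit(light_hours: float,
           crop_price: float = 5.0,            # $ per yield unit (assumed)
           energy_cost: float = 0.8) -> float:  # $ per light-hour (assumed)
    return crop_price * expected_yield(light_hours) - energy_cost * light_hours

# Grid-search half-hour steps from 0 to 24 hours of light per day.
best = max((h / 2 for h in range(49)), key=profit)
print(f"optimal lighting: {best:.1f} h/day -> profit {profit(best):.2f} $/plant")
```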
#Opportunity_Three_Business_Applications
With steady advances in technology and investment in the lower layers of IoT, the barrier to exploring business applications has never been lower. At the same time, because the financial opportunity is so large, the business application layer will remain highly fragmented. It is up to founders to study, explore, and consolidate the possibilities within the application layer, and to build something that businesses and individuals truly benefit from; those who do will reap the rewards.
If you are a founder working on any layer of IoT, or a founder interested in working with other IoT founders, we welcome you to join our network of over 1,100 founders from around Southeast Asia and Taiwan. Applications to join AppWorks Accelerator open in May; more information is available here: https://bit.ly/3eLDKWk
by Jack An, Analyst
[Lecture] 2019/11/19 (Tue) @ Engineering Building IV, Room 816 (智易空間): Prof. Geoffrey Li (Georgia Tech, USA) and Prof. Li-Chun Wang (NCTU, Taiwan) will speak on "Deep Learning based Wireless Resource Allocation / Deep Learning in Physical Layer Communications / Machine Learning Interference Management"
The IBM Center has invited Prof. Geoffrey Li (Georgia Tech, USA) and Prof. Li-Chun Wang (NCTU, Taiwan) to give these lectures. Interested faculty and students are welcome to register!
Title: Deep Learning based Wireless Resource Allocation / Deep Learning in Physical Layer Communications / Machine Learning Interference Management
Speakers: Prof. Geoffrey Li and Prof. Li-Chun Wang
Time: 2019/11/19 (Tue) 9:00–12:00
Venue: Engineering Building IV, Room 816 (智易空間), NCTU
Registration: https://forms.gle/vUr3kYBDB2vvKtca6
Registration details:
1. Fee (includes lecture notes, lunch, and refreshments): (1) NCTU students free, other students NT$300 per person; (2) industry participants and faculty NT$1,500 per person.
2. Capacity: 60 participants, admitted in order of completed registration (registration is complete only after payment).
※ How to register and pay:
1. Register: fill in your details at the registration link above.
2. Pay:
(1) In person at Engineering Building IV, Room 813 (please call ahead before coming).
(2) By bank transfer:
Account name: 曾紫玲 (Cathay United Bank, Hsinchu Science Park Branch, bank code 013)
Account number: 075506235774 (Cathay United Bank, Hsinchu Science Park Branch, bank code 013)
After transferring, please provide your name, the transfer time, and the last five digits of the transferring account so the payment can be verified.
※ Payment receipts will be issued on the day of the lecture.
Contact: 曾紫玲, Tel: 03-5712121 ext. 54599, Email: tzuling@nctu.edu.tw
Abstracts:
1. Deep Learning based Wireless Resource Allocation
【Abstract】
Judicious resource allocation is critical to mitigating interference, improving network efficiency, and ultimately optimizing wireless network performance. The traditional wisdom is to explicitly formulate resource allocation as an optimization problem and then exploit mathematical programming to solve it to a certain level of optimality. However, as wireless networks become increasingly diverse and complex, as in high-mobility vehicular networks, the current design methodologies face significant challenges and thus call for a rethinking of the traditional design philosophy. Meanwhile, deep learning represents a promising alternative due to its remarkable power to leverage data for problem solving. In this talk, I will present our research progress in deep learning based wireless resource allocation. Deep learning can help solve optimization problems for resource allocation or can be directly used for resource allocation. We will first present our results on using deep learning to solve linear sum assignment problems (LSAP) and to reduce the complexity of mixed integer non-linear programming (MINLP), and introduce graph embedding for wireless link scheduling. We will then discuss how to use deep reinforcement learning directly for wireless resource allocation, with application to vehicular networks.
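As a rough sketch of the LSAP direction, the snippet below generates training pairs in which SciPy's Hungarian solver (linear_sum_assignment) supplies ground-truth assignments that a neural network could be trained to imitate; the problem size and dataset layout are assumptions for illustration, not the speakers' setup:

```python
# Data generation for learning-based LSAP: random cost matrices labeled
# with their optimal assignments from the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

def make_lsap_dataset(n_samples: int = 1000, n: int = 4):
    X, Y = [], []
    for _ in range(n_samples):
        cost = np.random.rand(n, n)            # e.g., channel/interference costs
        _, cols = linear_sum_assignment(cost)  # optimal column for each row
        X.append(cost.ravel())
        Y.append(cols)
    return np.array(X), np.array(Y)

X, Y = make_lsap_dataset()
# A small fully connected classifier trained to map X -> Y would replace the
# O(n^3) Hungarian step with a single fast forward pass at run time.
print(X.shape, Y.shape)  # (1000, 16) (1000, 4)
```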
2. Deep Learning in Physical Layer Communications
【Abstract】
It has been demonstrated recently that deep learning (DL) has great potential to break the bottlenecks of conventional communication systems. In this talk, we present our recent work on DL in physical layer communications. DL can improve the performance of each individual (traditional) block in a conventional communication system or jointly optimize the whole transmitter or receiver, so applications of DL in physical layer communications can be categorized into those with and without block processing structures. For DL-based communication systems with block structures, we present joint channel estimation and signal detection based on a fully connected deep neural network, model-driven DL for signal detection, and some experimental results. For those without block structures, we describe our recent efforts in developing end-to-end learning communication systems with the help of deep reinforcement learning (DRL) and generative adversarial networks (GANs). At the end of the talk, we outline some potential research topics in the area.
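A toy version of the fully connected detector mentioned above might look like the sketch below, mapping a block of received samples directly to per-bit probabilities; the layer widths and dimensions are illustrative guesses loosely echoing published FC-DNN detectors, not the speakers' exact architecture:

```python
# Toy fully connected detector: received samples in, bit probabilities out.
# Sizes are illustrative assumptions.
import torch
import torch.nn as nn

class FCDetector(nn.Module):
    def __init__(self, n_rx: int = 128, n_bits: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_rx, 500), nn.ReLU(),
            nn.Linear(500, 250), nn.ReLU(),
            nn.Linear(250, n_bits), nn.Sigmoid(),  # per-bit probabilities
        )

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        return self.net(y)

detector = FCDetector()
y = torch.randn(32, 128)          # batch of received pilot + data samples
bits = (detector(y) > 0.5).int()  # hard bit decisions
print(bits.shape)                 # torch.Size([32, 16])
```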
3. Machine Learning Interference Management
【Abstract】
In this talk, we discuss how machine learning algorithms can address the performance issues of high-capacity ultra-dense small cells in an environment with dynamic traffic patterns and time-varying channel conditions. We introduce a bi-adaptive self-organizing network (Bi-SON) to exploit the power of data-driven resource management in ultra-dense small cells (UDSC). On top of the Bi-SON framework, we further develop an affinity propagation unsupervised learning algorithm to improve energy efficiency and reduce interference of operator-deployed and plug-and-play small cells, respectively. Finally, we discuss the opportunities and challenges of reinforcement learning and deep reinforcement learning (DRL) in more decentralized, ad-hoc, and autonomous modern networks, such as Internet of Things (IoT), vehicle-to-vehicle, and unmanned aerial vehicle (UAV) networks.
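To illustrate the affinity propagation step, the sketch below clusters randomly placed small cells by a distance-based similarity so that each resulting cluster could coordinate its resources; the cell layout and the interference proxy are invented for illustration, not the Bi-SON implementation:

```python
# Cluster small cells by mutual interference using affinity propagation.
# Similarity here is a crude distance-based proxy for interference.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
positions = rng.uniform(0, 100, size=(30, 2))  # 30 cells in a 100 m x 100 m area

dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
similarity = -dists ** 2  # closer cells interfere more -> higher similarity

ap = AffinityPropagation(affinity="precomputed", random_state=0)
labels = ap.fit_predict(similarity)
print(f"{labels.max() + 1} interference clusters found")
```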
Bio:
Dr. Geoffrey Li is a Professor with the School of Electrical and Computer Engineering at Georgia Institute of Technology. He was with AT&T Labs – Research for five years before joining Georgia Tech in 2000. His general research interests include statistical signal processing and machine learning for wireless communications. In these areas, he has published around 500 refereed journal and conference papers in addition to over 40 granted patents. His publications have been cited over 37,000 times, and he has been listed as a World's Most Influential Scientific Mind (Highly Cited Researcher) by Thomson Reuters almost every year since 2001. He has been an IEEE Fellow since 2006. He received the 2010 IEEE ComSoc Stephen O. Rice Prize Paper Award, the 2013 IEEE VTS James Evans Avant Garde Award, the 2014 IEEE VTS Jack Neubauer Memorial Award, the 2017 IEEE ComSoc Award for Advances in Communication, and the 2017 IEEE SPS Donald G. Fink Overview Paper Award. He also won the 2015 Distinguished Faculty Achievement Award from the School of Electrical and Computer Engineering, Georgia Tech.
Li-Chun Wang (M'96–SM'06–F'11) received his Ph.D. degree from the Georgia Institute of Technology, Atlanta, in 1996. From 1996 to 2000, he was with AT&T Laboratories, where he was a Senior Technical Staff Member in the Wireless Communications Research Department. He is currently the Chair Professor of the Department of Electrical and Computer Engineering and the Director of the Big Data Research Center at National Chiao Tung University in Taiwan. Dr. Wang was elected an IEEE Fellow in 2011 for his contributions to cellular architectures and radio resource management in wireless networks. He was a co-recipient of the IEEE Communications Society Asia-Pacific Board Best Award (2015), the Y. Z. Hsu Scientific Paper Award (2013), and the IEEE Jack Neubauer Best Paper Award (1997). He won the Distinguished Research Award of the Ministry of Science and Technology in Taiwan twice (2012 and 2016). He is currently an associate editor of IEEE Transactions on Cognitive Communications and Networking. His current research interests are in software-defined mobile networks, heterogeneous networks, and data-driven intelligent wireless communications. He holds 23 US patents, has published over 300 journal and conference papers, and co-edited the book "Key Technologies for 5G Wireless Systems" (Cambridge University Press, 2017).