Tuesday, May 12
Keynote 1: Practical meets Transformational Aspirations
Dr. Ibrahim Gedeon
Chief Technology Officer, TELUS, Canada
Abstract: NFV is here, and the term is thoroughly abused in our industry. NFV is a complete redesign of the network control and data planes. The impacts are huge, not just on network design but, more importantly, on people, culture, and processes. NFV is not just virtualization: yes, there are TCO benefits, but standard IT/server virtualization tools and instrumentation do not cover what is required from an NFV FCAPS point of view. This talk will capture some of the salient requirements that need to be tackled in the areas of wireless vEPC and vCDN.
Biography: Dr. Ibrahim Gedeon is one of the global telecommunications industry's most colorful and respected executives. He has carved out an international career by combining tremendous insight and skill as an applied scientist with a lighthearted and thoroughly unconventional approach to leadership. As Chief Technology Officer for TELUS, a leading national telecommunications company in Canada, his energy and engaging manner as a speaker and presenter are legendary within industry circles.
Dr. Gedeon began his career in telecommunications engineering and research in 1990 when he joined Bell Northern Research, designing signal-processing software in the Cryptographic Systems division. He moved to Nortel Networks in 1994 as a network design engineer.
Dr. Gedeon was named Vice President and Director of Data Network Engineering at Nortel in 1996, Vice President of Internet Brand Management in 1999, and Senior Vice President of Wireless Engineering in 2000. He joined TELUS in 2003 and is responsible for technology strategy, service and network architecture, and service delivery and operational support systems for the company's wireline and wireless divisions, as well as service and network convergence, enterprise applications, and network infrastructure strategy and evolution.
Dr. Gedeon has held numerous leadership roles in the Institute of Electrical and Electronics Engineers (IEEE) and has also received numerous professional awards and various forms of industry recognition, including being named four times to the Global Telecoms Business magazine’s “GTB Power 100,” a list of the 100 most powerful and influential people in the telecoms industry.
He is currently the Honorary Co-Chair of the IFIP/IEEE International Symposium on Integrated Network Management (IM 2015) in Ottawa, Canada, was the Co-Chair of the IEEE Vehicular Technology Conference 2014 (VTC 2014) in Vancouver, and was the General Chair of the 2012 IEEE International Conference on Communications (ICC 2012) in Ottawa, Canada. He also serves on the boards of a number of industry associations, including the Centre of Excellence in Next Generation Networks (CENGN)/Centres of Excellence for Commercialization and Research (CECR), Canada, the Next Generation Mobile Networks (NGMN) Alliance, the Washington, DC-based Alliance for Telecommunications Industry Solutions (ATIS), and the Institute for Communication Technology Management (CTM) at the University of Southern California's Marshall School of Business.
A native of Lebanon, he holds a Bachelor's degree in Electrical Engineering from the American University of Beirut and a Master's degree in Electronics Engineering from Carleton University. In 2010, he received an Honorary Doctor of Laws degree from the University of British Columbia.
Also renowned as a chef and gregarious host, Dr. Gedeon made his initial foray into publishing with a cookbook entitled From the Heart, released in 2006. It was followed by his much-anticipated book on leadership, Weeding out the Wankers – Life Lessons from the Workplace as Seen by a Technology Executive, in June 2011, and a new cookbook, Spirited Curry from the Heart, released in December 2012.
Keynote 2: Big Data in Science: The Good, The Bad, and The Ugly
Dr. Joseph L. Hellerstein
Senior Data Science Fellow, eScience Institute, and Affiliate Professor of Bioengineering, University of Washington, Seattle, Washington, USA
Abstract: Big data is changing science in fundamental ways. For example, prior to the widespread availability of DNA sequencing technology, biology was largely a descriptive science. Now it is common for oceanographers and other biologists to quantify relationships between species based on the characteristics of their DNA, an analysis that may require hundreds of terabytes of genome data. The social sciences are changing as well.
Historically, social scientists tested hypotheses using small-scale experiments on college students. Today, analysis of data from Twitter and other social media allows hypotheses about real-world social interactions to be tested at scale.
The potential of big data in science comes with challenges; obvious examples are version control, privacy, and sharing. The good news is that many of these considerations are well understood by IT managers. The bad news is that scientists often operate with low budgets and limited IT skills, so traditional approaches to IT management are too complex and too costly. This is a pressing problem that has turned ugly: it has been reported that top journals such as Science and Nature carry a large number of articles with irreproducible results because the data used in computational studies was lost and/or the code does not produce the results reported in the article. This talk will explore how software engineering practice and IT management can be adapted to support science in the 21st century.
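One lightweight practice that addresses the reproducibility problem described above is recording a cryptographic fingerprint of the exact data and code behind each published result. A minimal sketch in Python; the file names (genome_sample.fasta, analysis.py) are hypothetical placeholders, not from the talk:

```python
import hashlib
import json
from pathlib import Path

def fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    h.update(Path(path).read_bytes())
    return h.hexdigest()

# Stand-in input files for illustration only.
Path("genome_sample.fasta").write_text(">seq1\nACGTACGT\n")
Path("analysis.py").write_text("print('mean coverage: 42.0')\n")

# Record the exact inputs behind a result so it can be re-verified later.
manifest = {
    "data": fingerprint("genome_sample.fasta"),
    "code": fingerprint("analysis.py"),
}
Path("manifest.json").write_text(json.dumps(manifest, indent=2))

# Anyone re-running the study can confirm they have the same inputs:
saved = json.loads(Path("manifest.json").read_text())
assert saved["data"] == fingerprint("genome_sample.fasta")
```

Checking a manifest like this before rerunning an analysis catches the "lost or silently changed data" failure mode with no IT infrastructure beyond a hash function.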
Biography: Joseph L. Hellerstein is Senior Data Science Fellow in the eScience Institute and Affiliate Professor of Bioengineering, both at the University of Washington, Seattle, Washington. Previously, Dr. Hellerstein managed the Computational Discovery Department at Google (2008-2014), was a Principal Architect at Microsoft Corp. in Redmond, WA (2006-2008), and founded and directed the Adaptive Systems Department at the IBM Thomas J. Watson Research Center in Hawthorne, NY (1984-2006). Dr. Hellerstein received his PhD in computer science from the University of California at Los Angeeles.
He has published approximately 200 peer-reviewed papers, 30 patents, and two books. He has taught at Columbia University and the University of Washington, and has served on numerous program committees and government advisory panels. Dr. Hellerstein is a recipient of the IEEE/IFIP Stokesberry Award, and is a Fellow of the IEEE.
Wednesday, May 13
Keynote 3: Data Science: The Analytics of Big Data
Dr. José M. F. Moura
Professor, Carnegie Mellon University, PA, USA
Abstract: Data is growing at a rate that closely follows Moore's Law: the volume of existing data roughly doubles every two years. Besides volume, other V's characterize Big Data: variety, velocity, veracity, variability, value, and visualization; it is also often unstructured and distributed. Variety is key, with data arising from unconventional sources: social, business, urban, physical, biological, or molecular. Data is distributed in many settings, collected at thousands of points of sale worldwide, or across the urban landscape by millions of opportunities, from taxi rides to cameras, emergency calls, or demographics. "Unstructured" is often a misnomer: data may not fit neatly into a lattice, a vector (time series), or a table (image), but structured it is, indexed by social agents, genes, customers of service providers, or some other arbitrary enumeration suggested by the application. We will overview progress so far in reformulating the basic tenets of data and signal processing in the framework of Big Networked Data, illustrating the power of reckoning with the underlying graph structure of Big Data through applications ranging from social networks to the World Wide Web. Ours is an attempt to identify structure in unstructured data, and theory and modeling in the "data deluge."
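The idea of reckoning with the underlying graph structure can be made concrete with graph signal processing: treat the data as a signal indexed by graph nodes and filter it with polynomials of the adjacency (shift) matrix. A minimal NumPy sketch; the four-node graph and the signal values are invented for illustration, not drawn from the talk:

```python
import numpy as np

# Adjacency matrix of a small 4-node path graph: 0 - 1 - 2 - 3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

# A "graph signal": one value per node (e.g., activity per social agent).
x = np.array([1.0, 0.0, 0.0, 0.0])

# A graph filter is a polynomial in the shift operator A.
# Here h(A) = I + 0.5*A, which mixes each node's value with its neighbors'.
h = np.eye(4) + 0.5 * A
y = h @ x

print(y)  # -> [1.  0.5 0.  0. ]: node 0 keeps its value, node 1 receives half
```

The same pattern generalizes from this toy path graph to web graphs or social networks: only the adjacency matrix changes, while the filtering algebra stays the same.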
Biography: José M. F. Moura is the Philip L. and Marsha Dowd University Professor at CMU, with interests in signal processing and data science. He holds 12 US patents and has co-authored over 550 papers. He is the 2016 IEEE Vice President for Technical Activities, and was an IEEE Board Director, President of the IEEE Signal Processing Society (SPS), and Editor-in-Chief of the IEEE Transactions on Signal Processing. Moura received the IEEE SPS Technical Achievement Award and the IEEE SPS Society Award. He is a Fellow of the IEEE and of the AAAS, a corresponding member of the Academy of Sciences of Portugal, a Fellow of the US National Academy of Inventors, and a member of the US National Academy of Engineering.
Keynote 4: Harnessing a Petabyte: Opportunities and challenges for next generation ultra-scale data platforms
Dr. Richard J. Friedrich
Director, Systems Software for The Machine, HP Laboratories, Hewlett-Packard Company, USA
Abstract: The dawn of non-volatile memory (NVRAM) is here. Rack-scale and data-center-scale systems will soon support hundreds of terabytes to a petabyte of byte-addressable NVRAM. Next-generation big data analytics applications will program and access this memory at LOAD and STORE memory latency instead of FREAD and FWRITE disk latency.
This creates new opportunities for multi-data set analytics as well as longitudinal studies over months and years of data. But what challenges will systems software and systems management face in light of systems of this scale? This talk will describe a few of the critical challenges including reliability and security and will illustrate possible technical approaches based upon HP’s next generation big data platform called “The Machine.”
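The shift the abstract describes, from FREAD/FWRITE file I/O to LOAD/STORE access, can be previewed today with memory-mapped files: once a region is mapped, reads and writes are plain memory operations rather than a system call per access. A minimal Python sketch; a real NVRAM deployment would map a persistent-memory device, not the ordinary stand-in file used here:

```python
import mmap
import os

# A small backing file standing in for byte-addressable NVRAM.
path = "nvram_stub.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)

with open(path, "r+b") as f:
    # Map the file into the process address space.
    with mmap.mmap(f.fileno(), 4096) as mem:
        # LOAD/STORE-style access: ordinary byte indexing,
        # no read()/write() calls on the hot path.
        mem[0:5] = b"hello"
        assert mem[0:5] == b"hello"
        mem.flush()  # persist the stores back to the "device"

# The data outlives the mapping, as it would across restarts on NVRAM.
with open(path, "rb") as f:
    assert f.read(5) == b"hello"
os.remove(path)
```

On true NVRAM the flush step becomes the interesting systems problem: the platform must guarantee which stores are durable after a crash, which is one source of the reliability challenges the talk raises.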
The next decade will be dominated by sensing and sense-making. The confluence of processing power, communication speed, and storage density has opened the doors to the textual, statistical, and visual analysis of structured and unstructured data. Big data analytics has the potential to be one of the singular revolutionary innovations of the coming decade. Imagine easily deriving insights from the petabytes of documents, databases, images, sensors, audio, and video in your organization.
Current systems cannot handle this challenge. We address it with a dual-pronged approach: The Machine, a revolutionary new computing architecture, and Cognitive Computing, a platform for creating massively parallel cognitive applications that are easy to program, support adaptive learning, and scale to millions of cores on thousands of multi-core distributed processors. Cognitive computing, inspired by how the human brain works, has the potential to revolutionize visually-based information analytics, leading to exciting applications in a wide range of domains including context-based services, healthcare patient safety, fraud detection, surveillance, hydrocarbon discovery, sentiment analysis, and visualization.
Biography: Friedrich recently spent four years as the Director of the Strategy and Open Innovation Office in HP Labs. Leading a global team, he was responsible for applying open innovation to amplify and accelerate research investments and technology transfer. In this strategic role he was responsible for research investments in nanotechnology, exascale computing, cyber security, information analytics, cloud computing, 3-D immersive interaction, sustainability, social computing, and commercial digital printing. HP's Open Innovation Research Program is recognized as the only global, open, competitive innovation program to have established deep and impactful research collaborations between industry and academia. Recent successes include awards to 61 professors at 46 institutions in 12 countries.
Previously, Friedrich directed the Enterprise Systems and Software Lab at HP Labs. The team's research focused on ambitious next-generation enterprise computing and management systems; it invented distinctive cloud computing mechanisms that provide IT infrastructure and enterprise services on demand, as well as novel technologies that reduce data center power consumption by over 40%. These innovative results, demonstrated with the first cloud rendering service, used by DreamWorks to render the movie Shrek 2, reduced the total cost of ownership for HP customers while improving flexibility by automating IT operations for trusted, virtualized data centers.
Thursday, May 14
Keynote 5: From Software-Defined Infrastructures to Smart City Platforms
Dr. Alberto Leon-Garcia
Distinguished Professor, University of Toronto, Canada
Abstract: In this talk we explore the challenges in deploying software-defined infrastructures as platforms to support Smart City applications. We begin by describing our view of Software-Defined Infrastructure (SDI) and the role of integrated control and management of converged heterogeneous resources. SDI makes infrastructure programmable, supporting cloud-based applications, customized network functions, and hybrid combinations of the two. We motivate SDI in the context of a multi-tier cloud that includes massive-scale data centers as well as smart converged network edges and virtualized access networks. We describe the SAVI Canada-wide testbed for application platforms and its Janus integrated resource management system. Next we describe the socio-economic challenges that motivate the notion of a Smart City. We explore applications that can address these challenges and discuss the possible role of SDI in supporting these Smart City applications. We also consider the relevance of SDI management systems to the design of resource management systems in Smart City contexts.
Keynote 6: Micro Cloud: Moving Computing to Data to Deal with Data Management and Security Issues for Enterprise Clouds
Dr. Dinesh C. Verma
IBM Fellow and Department Group Manager of the Cloud-based Networks Area at the IBM T. J. Watson Research Center, USA
Abstract: Micro-cloud technology is a new model for cloud computing, especially suited to organizations that cannot move data onto the cloud because of insufficient bandwidth, latency, location-specific processing needs, data volume, security, or compliance constraints. A micro-cloud lets organizations realize the benefits of cloud computing and create new insights on their premises by moving computation and analytics, dynamically and intelligently, to where the data resides.
In this talk, we will examine the challenges that large enterprises face with cloud computing and how micro-cloud technology can help alleviate them, enabling these enterprises to obtain many of the benefits of cloud computing.