The Computer Science ceremony will be held at the Durham Convention Center following the main University Commencement Ceremony at Wallace Wade Stadium. Masks are recommended.
There are many graph problems of the following form: Given a graph G, construct a graph H that preserves some information about G, while optimizing some property of H. Some examples include spanners, distance preservers, reachability preservers, shortcut sets, and hopsets. I will focus on two of these:
- A spanner is a subgraph H of G that approximately preserves distances while being as sparse as possible.
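For concreteness, here is a minimal Python sketch of the classic greedy t-spanner construction (keep an edge only if the spanner built so far does not already connect its endpoints within the stretch factor t). The graph representation and the tiny example graph are illustrative choices, not taken from the talk.

```python
import heapq

def greedy_spanner(vertices, edges, t):
    """Classic greedy t-spanner: process edges by increasing weight and keep
    an edge (u, v, w) only if the spanner built so far does not already
    connect u and v within distance t * w. The result H then satisfies
    dist_H(u, v) <= t * dist_G(u, v) for every pair of vertices."""
    adj = {v: [] for v in vertices}      # adjacency list of the spanner H
    kept = []

    def dist_in_spanner(src, dst):
        # Dijkstra over the partially built spanner.
        best = {src: 0.0}
        pq = [(0.0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if u == dst:
                return d
            if d > best.get(u, float("inf")):
                continue                 # stale queue entry
            for v, w in adj[u]:
                nd = d + w
                if nd < best.get(v, float("inf")):
                    best[v] = nd
                    heapq.heappush(pq, (nd, v))
        return float("inf")

    for u, v, w in sorted(edges, key=lambda e: e[2]):   # increasing weight
        if dist_in_spanner(u, v) > t * w:
            adj[u].append((v, w))
            adj[v].append((u, w))
            kept.append((u, v, w))
    return kept

# Toy example: a 2-spanner drops the heavy edge (a, c) because a-b-c is short enough.
edges = [("a", "b", 1), ("b", "c", 1), ("a", "c", 3), ("c", "d", 2)]
print(greedy_spanner(["a", "b", "c", "d"], edges, t=2))
```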
In his 2008 lecture "Fun With Binary Decision Diagrams", Donald Knuth called BDDs "one of the only really fundamental data structures that came out in the last 25 years". It is indeed impossible to overstate the impact of BDDs on the verification of industrial hardware circuits and software protocols. A BDD is a graph-based data structure to encode a boolean function of boolean variables. In the first half of this talk, we introduce a new variant, RexBDDs, that provably improves the efficiency of BDDs and can seamlessly replace ordinary BDDs in practical implementations.
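As a rough illustration of the underlying data structure (ordinary BDDs, not the RexBDD variant the talk introduces), the following minimal Python sketch shows shared, reduced decision-diagram nodes and their evaluation; the node encoding and helper names (mk, evaluate) are invented for this example.

```python
# Terminals are the Python booleans; an internal node is (var_index, low, high),
# read as "if x_var then high else low" under a fixed variable order.
TRUE, FALSE = True, False

_unique = {}  # unique table: isomorphic nodes are shared, giving a canonical DAG

def mk(var, low, high):
    """Create a shared, reduced node for 'if x_var then high else low'."""
    if low is high:                      # reduction: redundant test, skip the node
        return low
    key = (var, id(low), id(high))
    if key not in _unique:
        _unique[key] = (var, low, high)
    return _unique[key]

def evaluate(node, assignment):
    """Evaluate a BDD under a dict mapping variable index -> bool."""
    while not isinstance(node, bool):
        var, low, high = node
        node = high if assignment[var] else low
    return node

# Example: f(x0, x1) = x0 AND x1, with variable order x0 < x1.
x1 = mk(1, FALSE, TRUE)
f = mk(0, FALSE, x1)
print(evaluate(f, {0: True, 1: True}))   # True
print(evaluate(f, {0: True, 1: False}))  # False
```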
Deep learning excels with large-scale unstructured data, which is common across many modern application domains, while probabilistic modeling offers the ability to encode prior knowledge and quantify uncertainty, which is necessary for safety-critical applications and downstream decision-making tasks. I will discuss examples from my research that bridge the gap between these two learning paradigms. The first half will show that insights from deep learning can improve the practicality of probabilistic models.
A cloud provider today offers its network resources to its tenants as a black box, so cloud tenants have little knowledge of the underlying network characteristics. Meanwhile, data-intensive applications have increasingly migrated to the cloud, and these applications have both the ability and the incentive to adapt their data transfer schedules based on the cloud network characteristics. We find that the black-box networking abstraction and the adaptiveness of data-intensive applications together create a mismatch, leading to sub-optimal application performance.
In recent years, reinforcement learning algorithms have achieved strong empirical success on a wide variety of real-world problems. However, these algorithms usually require a huge number of samples even for solving simple tasks. It is unclear whether there are fundamental statistical limits on such methods, or whether this sample-complexity burden can be alleviated by better algorithms. In this talk, I will give an overview of my research efforts toward bridging the gap between the theory and the practice of reinforcement learning.
Modern machine learning (ML) methods commonly rest on strong assumptions, such as: (1) access to data that adequately captures the application environment, and (2) that the goal is to optimize the objective function of a single agent in an application environment that is isolated and unaffected by the outcome chosen by the ML system. In this talk I will present methods with theoretical guarantees that remain applicable in the absence of (1) and (2), as well as corresponding fundamental lower bounds.
Advanced digital technologies rely on collecting and processing various types of sensitive data from their users. These data practices could expose users to a wide array of security and privacy risks. My research at the intersection of security, privacy, and human-computer interaction aims to help all people have safer interactions with digital technologies. In this talk, I will share quantitative and qualitative results on people’s security and privacy preferences and attitudes toward technologies such as smart devices and remote communication tools.
Packet loss rate in a broadband network is an important quality-of-service metric. Previous work that characterizes broadband performance does not separate packet loss caused by physical-layer transmission errors from that caused by congestion. In this work, we investigate physical-layer transmission errors using data provided by a regional cable ISP. The data were collected from 77K+ devices spread across 394 hybrid-fiber-coaxial (HFC) network segments over a 16-month period. We present a number of findings that are relevant to network operations and network research.
Machine learning (ML) is widely used today, ranging from applications in medicine to those in autonomous driving. Across all these applications, various forms of sensitive information are shared with the ML model, such as private medical records or a user’s location. In this talk, I will explain what forms of private information can be learned through interacting with an ML model. In particular, I will discuss when ML model parameters in cloud deployments are not confidential, and how this can be remediated.
As computer systems grow more and more complicated, performance optimizations can unintentionally introduce security vulnerabilities, which can lead to user information and data being compromised or stolen. Many processor optimizations focus on sharing or reusing hardware between different users or programs. This sharing can enable timing-based security attacks, in which the reuse of hardware components influences the timing of the operations performed on the processor.
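The attacks discussed here exploit shared processor hardware; as a much simplified, software-level analogy of how execution time can leak secret data, the Python sketch below contrasts an early-exit byte comparison with a constant-time one. The secret value and helper names are made up for illustration, and the absolute timings will vary from machine to machine.

```python
import hmac, time

SECRET = b"hunter2secretkey"   # hypothetical secret, for illustration only

def insecure_compare(a, b):
    """Early-exit byte comparison: running time depends on how many leading
    bytes of the guess match the secret, creating a timing side channel."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def timed(guess, trials=20000):
    start = time.perf_counter()
    for _ in range(trials):
        insecure_compare(SECRET, guess)
    return time.perf_counter() - start

wrong_first_byte = b"X" + SECRET[1:]
wrong_last_byte  = SECRET[:-1] + b"X"
print("mismatch at byte 0 :", timed(wrong_first_byte))
print("mismatch at byte 15:", timed(wrong_last_byte))   # typically slower
# A constant-time comparison such as hmac.compare_digest removes this leak.
print("constant-time equal?", hmac.compare_digest(SECRET, wrong_last_byte))
```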
High-speed RDMA networks are being rapidly adopted in industry for their low latency and reduced CPU overheads. To verify that RDMA can be used in production, system administrators need to understand the set of application workloads that can potentially trigger abnormal performance behaviors (e.g., unexpectedly low throughput, PFC pause frame storms). We design and implement Collie, a tool for users to systematically uncover performance anomalies in RDMA subsystems without needing access to hardware internal designs.
The advent of quantum computers places many widely used cryptographic protocols at risk. In response to this threat, the field of post-quantum cryptography has emerged. The most broadly recognized post-quantum protocols are related to lattices. Beyond their resistance to quantum attacks, lattices are instrumental tools in cryptography due to their rich mathematical structure. In this talk, I will present my work on understanding the complexity of lattice problems and on constructing lattice protocols useful in practical scenarios.
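For background only (standard definitions, not specific to the speaker's results), the central objects are defined as follows:

```latex
% A lattice generated by linearly independent basis vectors b_1, ..., b_n in R^m
% is the set of all integer combinations of those vectors:
\[
  \mathcal{L}(b_1,\dots,b_n) \;=\; \Bigl\{\, \textstyle\sum_{i=1}^{n} z_i\, b_i \;:\; z_i \in \mathbb{Z} \,\Bigr\}.
\]
% The Shortest Vector Problem (SVP) asks for a shortest nonzero vector of the
% lattice; the conjectured hardness of such problems (and their approximate
% versions) underlies lattice-based post-quantum cryptography.
```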
Machine learning models are composed of simple primitives such as matrix multiplication and non-linear transformations. Studying and improving these primitives is critical to advancing the capabilities of ML models: for example, the advent of powerful transformations such as convolutions and self-attention led to breakthroughs in deep learning. However, existing techniques have many drawbacks, including computational inefficiency and difficulty modeling long context.
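As a concrete example of such a primitive, here is a minimal NumPy sketch of single-head scaled dot-product self-attention; the shapes and random weight matrices are arbitrary illustrative choices. The seq_len-by-seq_len score matrix it builds is one source of the computational cost and long-context difficulty mentioned above.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence X of
    shape (seq_len, d_model): softmax(Q K^T / sqrt(d_k)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                             # 4 tokens, model width 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)              # (4, 8)
```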
Modern computers run on top of complex processors, but complexity is the worst enemy of security. Scientists and engineers have consistently tried to develop secure software systems for decades. However, my work shows that new classes of vulnerabilities in complicated processors can break the security guarantees provided by software systems, cryptographic protocols, and privacy technologies. In this talk, I will give an overview of my work on discovering, evaluating, and mitigating such vulnerabilities. First, I will talk about side-channel attacks on cryptographic implementations.
Triangle Computer Science Distinguished Lecturer Series
Speaker Name: Marilyn Wolf
Location (hybrid): Zoom (https://unc.zoom.us/j/96219713488) or 011 Sitterson Hall, Dept of Computer Science, UNC-CH, 201 S. Columbia St., Chapel Hill, NC 27599-3175
Date and Time: -
Perception is a critical computational task in autonomous vehicles. Autonomous vehicles place stringent and somewhat conflicting demands on perception systems: high accuracy, low latency, and performance on limited computational resources. The conflict between these requirements is particularly acute for unmanned aerial vehicles (UAVs) but also arises for ground vehicles. This talk will describe two related efforts to improve and manage efficiency in perception for autonomous vehicles. Work with Krishna Muuva and Justin Bradley of UNL looks at UAV-UAV tracking.
Differential privacy has become a de facto standard for extracting information from a dataset while protecting the confidentiality of individuals whose data are collected. It has been increasingly adopted in industry and the public sector. Crucial to any differentially private system is a set of privacy mechanisms, the building blocks of larger privacy-preserving algorithms. Those privacy mechanisms inject randomness into non-private computations in order to ensure privacy protections.
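A canonical example of such a mechanism is the Laplace mechanism, sketched below in Python; the function name and parameter choices are illustrative, and correctly calibrating the sensitivity and epsilon is exactly the kind of detail larger privacy-preserving algorithms must get right.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value plus Laplace noise of scale sensitivity / epsilon.
    For a query with L1 sensitivity 'sensitivity', this satisfies
    epsilon-differential privacy."""
    rng = rng or np.random.default_rng()
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Example: privately release a count (sensitivity 1) with epsilon = 0.5.
exact_count = 1234
print(laplace_mechanism(exact_count, sensitivity=1.0, epsilon=0.5))
```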
There is no doubt that robots will play a crucial role in the future and will need to work as teams in increasingly complex applications. Advances in robotics have laid the hardware foundations for building large-scale multi-robot systems, such as fleets of mobile robots and drones.