
John Baugh

Professor

Fitts-Woolard Hall 3163

Bio

John Baugh is a Professor of Civil Engineering and Operations Research at North Carolina State University. He received the Ph.D. from Carnegie Mellon University in 1989.

In addition to carrying out research and instructional programs, Dr. Baugh works broadly within the university to improve the educational opportunities and outcomes of its students. Those efforts include leading the creation of a new graduate program in computing and systems engineering and previously serving as Director of the NC Japan Center.

Before pursuing the Ph.D., he worked as a Research Engineer in Applied Mechanics and Structures at Battelle’s Pacific Northwest National Laboratory. He is a member of ACM, INFORMS, Phi Kappa Phi, Tau Beta Pi, and Chi Epsilon.

Education

Ph.D. Civil Engineering Carnegie Mellon University 1989

M.S. Civil Engineering Carnegie Mellon University 1984

B.C.E. Civil Engineering Auburn University 1983

Area(s) of Expertise

Dr. Baugh's research interests include scientific computing and cyber-physical systems, formal methods for safe and trustworthy software, verification and validation, mathematical optimization and control, and applications across civil engineering including coastal, ocean, transportation, and structural systems.

Current and past funding sources include the US Department of Energy, National Science Foundation, Department of Homeland Security, Federal Transit Administration, US Environmental Protection Agency, and North Carolina Supercomputing Center.


Grants

Date: 10/01/21 - 9/30/24
Amount: $300,000.00
Funding Agencies: National Science Foundation (NSF)

It takes a delicate balancing act, plus a significant amount of new research, to develop advanced machine-learning-based systems that are correct at the user level (e.g., fair treatment of the images they classify) and that run reliably and efficiently on modern hardware (consuming less energy and containing no software defects). The dependency chain is a balancing act between what exists and what will be researched and developed in this project.

First, higher-level correctness often calls for advanced network structures such as transformers, and there is a significant amount of ongoing research to draw upon. Second, such network structures cannot be deployed as-is; they must be suitably sparsified (or compressed to remove unimportant weights), because all fully trained "original" networks are by nature overparameterized (finding the minima of loss functions is ideally done in a higher-dimensional space with more weights). Keeping the redundant weights exacerbates FLOP as well as memory requirements, so unimportant weights must be discarded before deployment. There are many excellent compression systems, including our own, that one can use or extend. Third, pruning merely to achieve metrics such as top-1 classification accuracy is insufficiently incisive to guarantee semantic goals (e.g., fairness); new research is needed to prune in such a way that higher-level semantic goals are also met.

Fourth, when sparsified in interesting ways, networks exhibit sparsity patterns that are completely data driven. These sparse structures cannot be found in existing libraries or in familiar forms such as compressed sparse row (CSR), coordinate list (COO), or ELLPACK (ELL) formats, so new research is needed to map these "data-driven" sparsity patterns efficiently onto state-of-the-art and fast-moving software/hardware combinations. Fifth, the sparsity patterns desired (or demanded) by top-down analysis very often do not match what can be efficiently realized in software/hardware, and finding efficient patterns requires significant expertise. We find that recommendations coming from "bottom-up" analysis are often accepted without question, as ML experts working at higher levels do not have the wherewithal to deviate. Finally, there must be mechanisms to formally describe these patterns, either to map them onto the best patterns available or to provide the starting specifications for correctly and efficiently realizing them (say, in custom CUDA code or hardware).

By ensuring that the top-down-desired and the bottom-up-provided "meet in the middle," the proposed research will enable the development of advanced machine-learning-based systems that are correct at the user level and run reliably and efficiently on modern hardware.
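As a rough illustration of the ideas above (not the project's actual tooling), the following Python sketch shows how simple magnitude pruning yields a data-driven sparsity pattern and how the surviving weights might be packed into one of the standard formats mentioned, CSR; the layer size and pruning fraction are made up.

```python
# Hypothetical sketch: magnitude pruning of a trained layer's weight matrix,
# followed by packing into the compressed sparse row (CSR) format noted above.
# Names, sizes, and the pruning fraction are illustrative, not from the project.
import numpy as np
from scipy.sparse import csr_matrix

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)).astype(np.float32)  # stand-in for trained weights

# Magnitude pruning: drop the 90% of weights with the smallest absolute value.
# The surviving nonzero pattern is "data driven"; it depends entirely on W.
threshold = np.quantile(np.abs(W), 0.90)
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

# Pack the survivors into CSR; COO or ELL packing would be analogous.
W_csr = csr_matrix(W_pruned)
print(f"density after pruning: {W_csr.nnz / W.size:.2%}")

# A sparse matrix-vector product then touches only the retained weights.
x = rng.standard_normal(512).astype(np.float32)
y = W_csr @ x
```

Executing such unstructured, data-driven patterns efficiently on accelerators is precisely the gap the abstract describes.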

Date: 10/01/16 - 12/30/20
Amount: $3,520,000.00
Funding Agencies: US Dept. of Energy (DOE)

This project will develop methodology for verification and validation of advanced computer models used in Risk-Informed Safety Margin Characterization (RISMC) of nuclear power plants, and will apply the methodology to selected problems in nuclear reactor safety. The focus is on computer codes that simulate external hazards affecting plant safety, including severe accident management and emergency response. The project will collect, characterize, archive, and use data from plant measurements and from integral-effect and separate-effect tests to support code validation. The methodology will bring probabilistic risk assessment (PRA) to bear in guiding the implementation of a risk-informed validation approach.

Date: 07/01/08 - 6/30/14
Amount: $2,766,873.00
Funding Agencies: US Dept. of Homeland Security (DHS)

The dynamic evolution of landforms under stress can lead to catastrophic loss of either functionality or mass itself. This project will examine the dynamics of landforms undergoing a transition from one state to another (e.g., barrier island collapse, wetland loss, dune erosion) in order to identify the critical defining features of resilient natural and developed landforms. These descriptive dynamics will be translated into design parameters for restoring protective or beneficial landforms (e.g., beaches, dunes, barrier islands, wetlands). In addition, the analysis will be used to provide improved metrics for communicating hazard and risk and for incorporating hazard and risk into land use plans. The project lies at the interface between the Coastal Hazards Science and Planning for Resilience focus areas and has the potential to provide insights to the Hazards, Human Behavior and Economic Resilience focus area.

Date: 09/30/09 - 9/30/13
Amount: $390,000.00
Funding Agencies: US Dept. of Homeland Security (DHS)

The purpose of this proposal is to establish a graduate research fellowship program that trains students to be future leaders in the engineering of resilient civil infrastructure systems for coastal regions subject to natural hazards. The program will be conducted in coordination with the ongoing DHS Center of Excellence on Natural Disasters, Coastal Infrastructure and Emergency Management.

Date: 12/01/04 - 4/30/06
Amount: $683,000.00
Funding Agencies: Blue Ridge Analytics, Inc.

The project will develop computer software implementing mathematical models that represent civil engineering infrastructure problems. In addition, the software will provide optimization techniques such as mathematical programming and heuristic search methods. These techniques will be implemented so that alternative solutions meeting given constraints on modeled objectives can be obtained and presented to the user.
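As a hedged sketch of the "alternative solutions under constraints" idea described above (the problem data, tolerance, and variable meanings are hypothetical, and SciPy's linear programming routine stands in for whatever solvers the project used), one way to generate an alternative is to re-optimize for maximal difference while holding the objective within a relaxed target, in the spirit of modeling-to-generate-alternatives:

```python
# Hypothetical sketch: find an optimal plan, then a structurally different
# alternative whose cost stays within 10% of the optimum. All numbers are
# made up; scipy.optimize.linprog stands in for the project's solvers.
import numpy as np
from scipy.optimize import linprog

c = np.array([4.0, 3.0, 6.0])            # unit costs of three design options
A_ub = np.array([[-1.0, -1.0, -1.0],     # demand: x1 + x2 + x3 >= 4
                 [ 2.0,  1.0,  3.0]])    # resource limit: 2x1 + x2 + 3x3 <= 10
b_ub = np.array([-4.0, 10.0])
bounds = [(0, 5)] * 3

# Step 1: minimize cost to obtain the "best" solution.
best = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

# Step 2: seek a maximally different solution whose cost is within 10% of
# the optimum, by penalizing reuse of the first solution's choices.
target = 1.10 * best.fun
A_alt = np.vstack([A_ub, c])               # add the cost-target row: c @ x <= target
b_alt = np.append(b_ub, target)
c_alt = np.where(best.x > 1e-6, 1.0, 0.0)  # weight on variables the optimum relied on
alt = linprog(c_alt, A_ub=A_alt, b_ub=b_alt, bounds=bounds)

print("optimal:    ", np.round(best.x, 2), "cost", round(best.fun, 2))
print("alternative:", np.round(alt.x, 2), "cost", round(float(c @ alt.x), 2))
```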

