Astrolabe Analytics
Organization and Analysis of Battery Data from Heterogeneous Sources
In the rapidly evolving field of energy storage, battery technology plays a crucial role in enabling sustainable solutions. Companies working with batteries often collect vast amounts of data from various sources, such as cell-level and pack-level measurements, cycling tests, and open-circuit voltage experiments. However, this data is typically heterogeneous, originating from different equipment, formats, and naming conventions. The lack of standardization poses significant challenges:

• Data Inconsistency: Varying column names, units, and formats make it difficult to aggregate and compare datasets.
• Inefficient Data Processing: Data scientists spend considerable time manually cleaning and transforming data before analysis.
• Delayed Insights: Slow data processing delays actionable insights, hindering timely decision-making.

These challenges underscore the need for a streamlined process to collect, standardize, and store battery data efficiently. An effective solution would enhance data usability, reduce manual effort, and accelerate the analysis pipeline.

This student team will work to develop a web-based portal designed to facilitate the seamless upload and standardization of battery data from diverse sources. The project will encompass both front-end user interface design and back-end data processing, culminating in a functional prototype. The key components of the solution this student team will work to achieve are:

1) User-Friendly Web Portal

Potential flow: Astrolabe Analytics provides access by sending customers a link to the platform, where they provide their credentials or basic information such as name, company, and associated project. Once logged in, users proceed to data type selection, where they choose the kind of data they wish to upload from predefined categories such as discharge data, cycling data, or open-circuit voltage data, or specify 'Other' for additional types.
The portal features an intuitive upload process, supporting drag-and-drop functionality that allows users to upload multiple files at once. After upload, each dataset is tagged with relevant metadata, linking it to the correct customer and project within the database.

2) Backend Data Management

The backend should have the following features:

i) Data Standardization Pipeline (can be done in collaboration with Astrolabe Analytics' Lead Data Scientist)
   a. Automated Transformation: Implement scripts to standardize column names, units, and data formats based on predefined conventions (e.g., using snake_case for column headers like voltage_v, current_a).
   b. Column Mapping Interface: For datasets with unconventional formats, provide an interface for users to map their columns to standard names during the upload process.
   c. Error Handling: Detect and flag inconsistencies or missing data for manual review.
ii) Data Cataloguing: Standardized datasets are organized and stored, ready for immediate use by data scientists.
iii) Cloud Integration: Uploaded files are securely stored on Amazon Web Services (AWS), ensuring scalability and reliability (can be done in collaboration with our Cloud Architect).

The outcomes of the design phase that this student team will work to achieve are:
• Develop the architecture for both front-end and back-end components.
• Create a database schema to store metadata and standardized data.

The outcomes of the implementation phase that this student team will work to achieve are:
• Build the web portal with user authentication and data upload capabilities.
• Develop the data standardization scripts and column mapping interface.
• Integrate AWS for file storage and management (interfacing with our Cloud Architect for AWS deployment).
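As a rough illustration of the automated transformation and error-handling steps described for the standardization pipeline, a minimal Python sketch might look like the following. The alias table, function names, and flagging behavior here are assumptions for illustration, not Astrolabe's actual conventions; only the target headers `voltage_v` and `current_a` come from the project brief.

```python
import re

# Hypothetical alias table mapping vendor-specific headers to the standard
# snake_case names mentioned in the brief (voltage_v, current_a). The real
# mapping would come from Astrolabe's predefined conventions or the
# user-facing column mapping interface.
COLUMN_ALIASES = {
    "Voltage (V)": "voltage_v",
    "Volts": "voltage_v",
    "Current (A)": "current_a",
    "Amps": "current_a",
}

def to_snake_case(name: str) -> str:
    """Fallback normalization: replace non-alphanumeric runs with underscores."""
    cleaned = re.sub(r"[^0-9a-zA-Z]+", "_", name.strip())
    return cleaned.strip("_").lower()

def standardize_row(row: dict) -> tuple[dict, list[str]]:
    """Rename one record's keys to standard names and flag missing values."""
    out, issues = {}, []
    for key, value in row.items():
        std = COLUMN_ALIASES.get(key, to_snake_case(key))
        out[std] = value
        if value is None or value == "":
            issues.append(f"missing value in '{std}'; flagged for manual review")
    return out, issues
```

In a full pipeline, the flagged issues would be surfaced to the user for manual review rather than silently dropped, matching the error-handling requirement above.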
The outcomes of the testing and results phase that this student team will work to achieve are:
• Validate the system with sample datasets representing various data types.
• Demonstrate the reduction in data processing time and improved data consistency.

Finally, this student team will work to provide user manuals and technical documentation for future maintenance and scalability.
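As one possible starting point for the database-schema design outcome, the sketch below links each uploaded dataset to a customer and project and records the user-selected data type. All table and column names are placeholders (this is not a committed design), and an in-memory SQLite database stands in for whatever production database the team chooses.

```python
import sqlite3

# Placeholder schema: each dataset belongs to a project, each project to a
# customer, mirroring the metadata association described in the brief.
SCHEMA = """
CREATE TABLE customers (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    company TEXT
);
CREATE TABLE projects (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id),
    name        TEXT NOT NULL
);
CREATE TABLE datasets (
    id          INTEGER PRIMARY KEY,
    project_id  INTEGER NOT NULL REFERENCES projects(id),
    data_type   TEXT NOT NULL CHECK (data_type IN
                  ('discharge', 'cycling', 'open_circuit_voltage', 'other')),
    storage_key TEXT NOT NULL,  -- location of the raw file in AWS storage
    uploaded_at TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

conn = sqlite3.connect(":memory:")  # stand-in for the production database
conn.executescript(SCHEMA)
```

Storing only a storage key in the `datasets` table (rather than file contents) keeps the metadata database small while the raw and standardized files live in AWS, consistent with the cloud-integration component above.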
Faculty Adviser
David Beck,
Director of Research, eScience Institute; Director, Scientific Software Engineering Center