Other Projects

Smaller projects involving Python scripting and data analysis


Cat Water Intake Tracker

This project began as a practical solution for monitoring my cats' water consumption using Google Sheets. Over time, I expanded it into a lightweight data analysis tool to explore potential environmental factors that might influence their hydration habits.

  • Tracks refill dates and water amounts after each top-off
  • Estimates consumption from refill amounts and the time elapsed between refills
  • Integrates with Google Apps Script to retrieve average temperature and humidity for each period
  • Visualizes correlations between environmental factors and water intake (none identified so far)

View the tracker in Google Sheets
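
For illustration, the consumption estimate is essentially each top-off amount divided by the days since the previous refill. The tracker itself lives in Google Sheets, so the Python below is only a sketch of that logic, with made-up dates and amounts:

    # Illustrative only: the real tracker is a Google Sheet with Apps Script.
    # Each entry is (refill date, millilitres added to bring the bowl back to full).
    from datetime import date

    refills = [
        (date(2024, 6, 1), 450),
        (date(2024, 6, 3), 430),
        (date(2024, 6, 6), 610),
    ]

    # The amount added at each refill approximates what was drunk since the last one.
    for (prev_day, _), (day, added_ml) in zip(refills, refills[1:]):
        days = (day - prev_day).days or 1  # guard against same-day top-offs
        print(f"{prev_day} -> {day}: ~{added_ml / days:.0f} ml/day")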

Meraki Network IP Aggregator

Built a Python script to automate the retrieval of public IPs, ISP names, and city-level locations for Meraki networks across an organization. The script reads a list of networks from a CSV, queries the Meraki Dashboard API (including the VPN status and uplink endpoints), and enriches the data using an IP geolocation API.

  • Used the Meraki Dashboard API and requests to gather device and VPN public IP info.
  • Parsed nested fields to handle cases where IPs were hidden in uplink data.
  • Cleaned ISP names using regex and geolocated IPs via ipinfo.io.
  • Exported results to a final CSV report including network name, public IP, ISP, and city.

This helped automate what was previously a manual network audit process.
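
A minimal sketch of this kind of aggregation, assuming the Meraki Dashboard API v1 appliance VPN statuses endpoint and anonymous ipinfo.io lookups; the API key and organization ID are placeholders, and the CSV input, pagination, and uplink fallback handling from the full script are omitted:

    # Pull per-network public IPs from the Meraki Dashboard API, then enrich
    # each IP with ISP and city via ipinfo.io, and write a CSV report.
    import csv
    import re
    import requests

    API_KEY = "your-meraki-api-key"   # placeholder
    ORG_ID = "123456"                 # placeholder
    headers = {"X-Cisco-Meraki-API-Key": API_KEY}

    # Appliance VPN statuses list each network's uplinks and their public IPs
    # (pagination omitted for brevity).
    statuses = requests.get(
        f"https://api.meraki.com/api/v1/organizations/{ORG_ID}/appliance/vpn/statuses",
        headers=headers,
        timeout=30,
    ).json()

    rows = []
    for net in statuses:
        for uplink in net.get("uplinks", []):
            ip = uplink.get("publicIp")
            if not ip:
                continue
            geo = requests.get(f"https://ipinfo.io/{ip}/json", timeout=10).json()
            # ipinfo's "org" looks like "AS1234 Example ISP"; strip the AS number.
            isp = re.sub(r"^AS\d+\s+", "", geo.get("org", ""))
            rows.append([net.get("networkName"), ip, isp, geo.get("city", "")])

    with open("network_ip_report.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["network", "public_ip", "isp", "city"])
        writer.writerows(rows)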


Device Inventory Parsing and Analysis

I developed a Python script to parse an exported HTML table of device inventory data. The script extracts key information such as phone model, MAC address, IP address, and user description. Based on the IP address, each device is classified by location (e.g., internal, remote, or unknown).

The extracted data is exported to a structured CSV file, which is then analyzed to generate summary tallies. These tallies show the number of devices per model across different locations, offering clear insight into deployment patterns.
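
A condensed sketch of this parsing approach; the input file name, column order, and the IP prefixes used for location classification are illustrative assumptions rather than the actual export's layout:

    # Extract rows from an exported HTML table with BeautifulSoup, classify each
    # device by IP, export to CSV, and tally models per location.
    import csv
    from collections import Counter
    from bs4 import BeautifulSoup

    def classify(ip):
        if ip.startswith("10."):          # assumed internal range
            return "internal"
        if ip.startswith("192.168."):     # assumed remote range
            return "remote"
        return "unknown"

    with open("inventory_export.html", encoding="utf-8") as f:
        soup = BeautifulSoup(f, "html.parser")

    devices = []
    for row in soup.select("table tr")[1:]:              # skip the header row
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if len(cells) < 4:
            continue
        model, mac, ip, description = cells[:4]          # assumed column order
        devices.append((model, mac, ip, description, classify(ip)))

    with open("inventory.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["model", "mac", "ip", "description", "location"])
        writer.writerows(devices)

    # Summary tally: devices per (model, location) pair.
    print(Counter((model, loc) for model, _, _, _, loc in devices))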

This project demonstrates my ability to clean, structure, and analyze semi-structured data from real-world systems using Python, BeautifulSoup, and standard libraries.

CSV Comparison and Change Tracking

In this project, I developed a lightweight Python workflow to monitor changes across recurring CSV exports. The goal was to track shifts in specific classification fields and identify newly introduced records between two time points.

  • Filtered relevant rows based on custom tags.
  • Generated simplified CSVs with only the required columns for tracking.
  • Created a derived classification column based on IP address patterns.
  • Built a script to compare current and previous datasets:
    • Detected records that shifted from one classification state to another.
    • Flagged new entries not found in the earlier dataset.
  • Exported results to a clean, actionable summary file.

This type of delta analysis is useful for highlighting trends, surfacing anomalies, and maintaining visibility into evolving datasets in a minimally disruptive way.
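
A minimal sketch of the comparison step using pandas, assuming a stable key column named id and a derived classification column; the real exports use different file and column names:

    # Compare two recurring CSV exports: flag classification changes and new records.
    import pandas as pd

    previous = pd.read_csv("previous_export.csv")
    current = pd.read_csv("current_export.csv")

    # Left-join the previous classification onto the current export by key.
    merged = current.merge(
        previous[["id", "classification"]],
        on="id",
        how="left",
        suffixes=("", "_prev"),
    )

    # Records whose classification shifted between the two exports.
    changed = merged[
        merged["classification_prev"].notna()
        & (merged["classification"] != merged["classification_prev"])
    ]

    # Records present now but absent from the earlier export.
    new_entries = merged[merged["classification_prev"].isna()]

    changed.to_csv("classification_changes.csv", index=False)
    new_entries.to_csv("new_records.csv", index=False)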

Latency Monitoring via Pinging

This project focused on gathering real-time network latency data across a list of devices. I built a Python script to read a CSV containing hostnames or IP addresses, ping each device, and capture the latency.

  • Read and iterated over a structured list of devices exported from existing systems.
  • Used standard networking tools to measure ping response times.
  • Appended latency data back to the source file or wrote a clean summary for analysis.

This lightweight diagnostic tool helped quickly surface performance issues across distributed endpoints and was easily adaptable for regular network health snapshots.
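
A simplified sketch of such a ping sweep; it shells out to the system ping command (Linux/macOS flags shown, Windows differs), and the file and column names are placeholders:

    # Read hosts from a CSV, ping each once, and write a latency summary CSV.
    import csv
    import re
    import subprocess

    results = []
    with open("devices.csv", newline="") as f:
        for row in csv.DictReader(f):
            host = row["host"]
            proc = subprocess.run(
                ["ping", "-c", "1", "-W", "2", host],
                capture_output=True,
                text=True,
            )
            # Pull the "time=12.3 ms" value out of the ping output, if any.
            match = re.search(r"time[=<]([\d.]+)\s*ms", proc.stdout)
            results.append((host, match.group(1) if match else "unreachable"))

    with open("latency_summary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["host", "latency_ms"])
        writer.writerows(results)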