Data tools
- Low-friction “try it” flows: fewer moving parts for a visitor than cloning a backend repo
- Typical surface: Streamlit (or similar) served over HTTPS on a small host or PaaS
- Shows buildable product slices; paid client work is described under Services
Both this page and Portfolio can list public URLs; the split reflects what you are evaluating. Here: stakeholder-style interaction (charts, exports, guided steps). On Portfolio: system boundaries, contracts, and batch- or API-heavy work.
Services describes what I deliver under contract: dashboards, Sheets, Power BI, automation, and reporting cadence. The apps on this page are portfolio demos, not the commercial menu.
GitHub profile: opens everything public on my account. Each project card on this page links to its own repository and live demo in the footer.
Apps in this track
Each app runs as its own small service (typically Streamlit behind TLS). Cards use the same layout as Portfolio: GitHub plus Live demo for every entry.
Data cleaning toolkit
Upload messy tables (CSV, Excel, Parquet, JSON; multi-file merge supported), review detected issues, apply rules, download cleaned data plus an HTML report that records what changed.
Emphasizes accountable mutation: useful when a client asks “what did you do to my file?” Demo scenarios in the repo illustrate realistic defects.
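A minimal sketch of what “accountable mutation” can look like, assuming a hypothetical `apply_rule` helper and a simple change log; the real toolkit's rule engine and HTML report format may differ:

```python
import pandas as pd

def apply_rule(df: pd.DataFrame, column: str, rule_name: str, fn, log: list) -> pd.DataFrame:
    """Apply one cleaning rule to one column and record how many cells changed."""
    before = df[column].copy()
    df = df.copy()
    df[column] = fn(df[column])
    # Compare with NaN placeholders so missing-to-missing does not count as a change.
    changed = int((before.fillna("__na__") != df[column].fillna("__na__")).sum())
    log.append({"rule": rule_name, "column": column, "cells_changed": changed})
    return df

# Example: trim whitespace, then normalize case, in a messy text column.
raw = pd.DataFrame({"city": ["  London", "paris ", "PARIS", None]})
log: list = []
clean = apply_rule(raw, "city", "strip_whitespace", lambda s: s.str.strip(), log)
clean = apply_rule(clean, "city", "title_case", lambda s: s.str.title(), log)
# `log` now holds one entry per rule, ready to render into an HTML report.
```

Each rule leaves an auditable trace, which is the property the “what did you do to my file?” question demands.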
EDA report generator
Read-only profiling: types, missingness, duplicates, histograms, correlations, quick heuristics; optional PDF when the server has the right system libraries.
Complements the cleaning toolkit: profile first, fix elsewhere. Configurable caps and sampling keep large files honest in the report narrative.
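A read-only profiling pass with a sampling cap can be sketched as follows; the `MAX_ROWS` value and the exact set of metrics are illustrative assumptions, not the app's real configuration:

```python
import pandas as pd

MAX_ROWS = 50_000  # hypothetical cap; the real limits live in config

def profile(df: pd.DataFrame) -> dict:
    """Read-only profile: types, missingness, duplicates. Never mutates the input."""
    sampled = len(df) > MAX_ROWS
    view = df.sample(MAX_ROWS, random_state=0) if sampled else df
    return {
        "rows": len(df),
        "sampled": sampled,  # reported so the narrative stays honest about large files
        "dtypes": {c: str(t) for c, t in view.dtypes.items()},
        "missing_pct": (view.isna().mean() * 100).round(1).to_dict(),
        "duplicate_rows": int(view.duplicated().sum()),
    }

report = profile(pd.DataFrame({
    "id": [1, 2, 2, 2],
    "amount": [9.5, None, 3.0, 3.0],
}))
```

Flagging `sampled` in the output is what lets the report narrative say plainly when numbers come from a sample rather than the full file.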
AI data analysis app
Upload a business-style CSV and get a full insight pass in the browser: dataset profile, up to six Plotly charts with sensible defaults, rule-based anomaly and quality hints, then an optional OpenAI executive summary, built only from metrics and aggregates, never raw rows. Everything rolls into one downloadable HTML report for email or offline review.
Positioned for sales, marketing, and customer-behavior-shaped tables (UTF-8 CSV; sensible row/size limits with guardrails in config). No API key? The report still runs; the AI section states what is missing instead of failing silently. Optional access limits for public demos. Containerized path with Docker Compose and VPS notes in the repo; same stack as the live app: pandas, Jinja2 templates, pytest.
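The two guarantees above, aggregates-only prompts and a graceful no-key fallback, can be sketched like this; the function names, prompt wording, and fallback message are hypothetical, not the app's actual implementation:

```python
import os
import pandas as pd

def build_summary_prompt(df: pd.DataFrame) -> str:
    """Build the LLM prompt from aggregates only: raw rows never leave this function."""
    stats = {
        "rows": len(df),
        "columns": list(df.columns),
        "numeric_summary": df.describe().round(2).to_dict(),
    }
    return f"Write an executive summary of these dataset metrics: {stats}"

def ai_section(df: pd.DataFrame) -> str:
    """Degrade gracefully: no key yields an explicit note, not a silent failure."""
    if not os.environ.get("OPENAI_API_KEY"):
        return "AI summary skipped: no OPENAI_API_KEY configured."
    prompt = build_summary_prompt(df)
    # ...call the OpenAI API with `prompt` here and return the model's text...
    return prompt

# Simulate a public demo host with no key configured.
os.environ.pop("OPENAI_API_KEY", None)
section = ai_section(pd.DataFrame({"revenue": [100, 200, 300]}))
```

Keeping prompt construction in one function makes the privacy boundary auditable: anything the model sees must pass through `build_summary_prompt`.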
KPI dashboard app
Single-page Streamlit dashboard for sales or marketing CSVs: KPI cards, trend and breakdown charts, optional period deltas, and a short “what changed?” narrative: rule-based by default, with an optional OpenAI pass that only ever sees pre-aggregated KPIs, never the raw file.
Auto column detection plus manual mapping and JSON presets for repeat schemas; explicit validation, CTR guardrails, and minimum-rows-per-period gates before any numbers are shown. Export a timestamped snapshot folder and download it as a ZIP for handoffs. Docker-ready; pytest CI on GitHub. Deliberately narrow MVP: no warehouse connectors or enterprise RBAC.
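A minimum-rows-per-period gate can be sketched as below; the threshold, column names, and month-over-month framing are illustrative assumptions, not the dashboard's real rules:

```python
import pandas as pd

MIN_ROWS_PER_PERIOD = 3  # hypothetical threshold; the real gate is configurable

def period_delta(df: pd.DataFrame, date_col: str, value_col: str):
    """Month-over-month total delta, gated: return None if any period is too thin."""
    monthly = df.groupby(pd.Grouper(key=date_col, freq="MS"))[value_col]
    counts = monthly.count()
    if (counts < MIN_ROWS_PER_PERIOD).any():
        return None  # too little data per period to show a trustworthy delta
    totals = monthly.sum()
    return float(totals.iloc[-1] - totals.iloc[-2])

df = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-05", "2024-01-12", "2024-01-20",
                            "2024-02-03", "2024-02-10", "2024-02-24"]),
    "revenue": [100, 120, 80, 150, 90, 110],
})
delta = period_delta(df, "date", "revenue")  # Feb total 350 vs Jan total 300
```

Returning `None` instead of a number is the point: the UI can render “not enough data” rather than a delta computed from one or two rows.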
Forecasting app
Univariate series with point forecasts and intervals, backtests against a naive baseline, and an explicit model status when history is too short or the fit fails.
Honest MVP: no regressors or holiday calendars in v1, which sets expectations for buyers who want transparent methods over black-box hype.
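The naive baseline and the explicit too-short-history status can be sketched as follows; the function name, windowing, and error message are illustrative, not the app's actual backtest:

```python
def naive_backtest(series: list[float], horizon: int = 1) -> float:
    """Mean absolute error of the naive 'last value carries forward' forecast."""
    if len(series) < horizon + 1:
        # Explicit status instead of a silent or misleading number.
        raise ValueError("history too short to backtest")
    errors = [abs(series[i] - series[i - horizon]) for i in range(horizon, len(series))]
    return sum(errors) / len(errors)

# Any model the app fits should beat this MAE to justify its extra complexity.
mae = naive_backtest([10.0, 12.0, 11.0, 13.0])
```

Publishing the baseline alongside the model is what makes the comparison transparent rather than hype.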
These five apps are listed here with repository and live-demo links on each card. Commercial offerings and how engagements work are described on Services—not on this portfolio grid.