Scream
Challenge
Template Complexity: Different instruments (Indices vs. Baskets vs. Tickers) require different upload templates. Using the wrong layout causes immediate system rejection.
The Overwrite Risk: We generate many files daily. A major pain point was accidentally saving a new file over an existing one with the same name, erasing previous work.
"Bloated" Source Data: Source files often arrive as massive raw dumps (via email or drive) full of noise. Finding the few relevant tickers hidden inside required manual "Text-to-Columns" and filtering every single time.
The Quarterly Nightmare: Once a quarter, we must re-upload everything. My manager used to spend hours manually opening and combining hundreds of past files into one master sheet.
Inconsistent Input Formats: The upstream team sends data in unpredictable formats (e.g., "ZTS USD" vs. "USD ZTS"). Since we can't control the source, this inconsistency constantly broke our Bloomberg (BBG) uploaders.
Duplicate Injection Errors: The target system rejects duplicate tickers with error messages. When uploading large batches, manually identifying and removing these "already existing" tickers from the error logs took hours.
Solution
Context-Aware Generation: I designed modular macros for each instrument type. The system identifies if the input is an Index or Basket and automatically routes data to the correct logic, generating the specific template required.
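The routing idea can be sketched as a simple dispatch table. This is an illustrative Python sketch, not the original VBA; the builder functions and type names are hypothetical stand-ins for the instrument-specific macros.

```python
# Hypothetical template builders standing in for the real macros.
def build_index_template(rows):
    return [("INDEX", r) for r in rows]

def build_basket_template(rows):
    return [("BASKET", r) for r in rows]

# Dispatch table: instrument type -> template logic.
BUILDERS = {
    "index": build_index_template,
    "basket": build_basket_template,
}

def route(instrument_type, rows):
    """Send input rows to the correct template builder; fail fast on unknown types."""
    try:
        builder = BUILDERS[instrument_type.lower()]
    except KeyError:
        raise ValueError(f"No template for instrument type: {instrument_type}")
    return builder(rows)
```

Failing fast on an unrecognised type mirrors the system's own behaviour of rejecting a wrong layout, but catches it before upload.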
Smart Versioning (UniquePath): I wrote a custom function called UniquePath. Before saving, it checks whether the filename already exists. If it does, it automatically appends a suffix (_1, _2, etc.) instead of overwriting, ensuring zero data loss.
Automated Scrubbing: I built a "cleanup macro" that acts as a filter. It instantly processes raw files, performs text separation (Text-to-Columns), discards the noise, and isolates only the tickers we need to process.
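The versioning check behind UniquePath can be sketched in a few lines. The original is a VBA function; this is a minimal Python equivalent of the same check-then-suffix logic.

```python
import os

def unique_path(path):
    """Return `path` unchanged if it is free; otherwise append _1, _2, ...
    until an unused filename is found, so nothing is ever overwritten."""
    if not os.path.exists(path):
        return path
    root, ext = os.path.splitext(path)
    n = 1
    while os.path.exists(f"{root}_{n}{ext}"):
        n += 1
    return f"{root}_{n}{ext}"
```

Because the loop keeps incrementing until it finds a free name, saving "report.xlsx" twice yields "report_1.xlsx", then "report_2.xlsx", and so on.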
Historical Batch Aggregator: For the quarterly re-upload, I designed a tool that loops through folders, parses dates directly from filenames to filter for a specific time range, and merges hundreds of workbooks into a single master file in seconds.
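The date-from-filename filtering step can be sketched as follows. This is a Python illustration of the loop, assuming an ISO-style date (YYYY-MM-DD) embedded in each filename; the real tool merges the matching workbooks afterwards.

```python
import re
from datetime import date
from pathlib import Path

# Assumed filename pattern, e.g. "upload_2024-03-15.xlsx".
DATE_RE = re.compile(r"(\d{4})-(\d{2})-(\d{2})")

def file_date(name):
    """Parse a date out of a filename, or None if no date is present."""
    m = DATE_RE.search(name)
    return date(int(m.group(1)), int(m.group(2)), int(m.group(3))) if m else None

def files_in_range(folder, start, end):
    """Walk the folder tree and keep workbooks whose filename date
    falls inside [start, end] -- the candidates for the merge step."""
    hits = []
    for p in Path(folder).rglob("*.xlsx"):
        d = file_date(p.name)
        if d and start <= d <= end:
            hits.append(p)
    return sorted(hits)
```

Filtering on the filename avoids opening hundreds of workbooks just to discard them, which is where most of the manual time went.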
Ticker Normalization Engine: I built a conditional parsing logic to handle chaotic inputs. Whether the source says "USD ZTS" or "ZTS USD", the macro detects the pattern, strips the currency noise, and reconstructs it into the standard BBG format (e.g., "ZTS US Equity").
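The normalization logic can be sketched like this. It is a Python illustration of the pattern detection, and the currency list and the USD-to-"US" suffix mapping are assumptions for the example; the real macro reconstructs the full BBG format.

```python
# Assumed currency codes to strip; extend as needed.
CURRENCIES = {"USD", "EUR", "GBP", "JPY"}
# Hypothetical currency -> BBG exchange-suffix map for the example.
EXCHANGE_SUFFIX = {"USD": "US"}

def normalize(raw):
    """Accept 'ZTS USD' or 'USD ZTS' and emit the standard BBG form."""
    parts = raw.strip().upper().split()
    ticker = next(p for p in parts if p not in CURRENCIES)
    ccy = next((p for p in parts if p in CURRENCIES), "USD")
    return f"{ticker} {EXCHANGE_SUFFIX.get(ccy, 'US')} Equity"
```

Because the ticker is found by elimination rather than by position, the word order of the source no longer matters.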
Intelligent Error Parsing: I engineered a feedback-loop macro. It parses the system's raw error message text, extracts the names of duplicate tickers into a memory array, and uses XLOOKUP to instantly identify and delete these pre-existing tickers from the upload queue.
Key Impact
I transformed a previously scattered, manual workflow into the team's first centralized automation suite, consolidating ad-hoc manual steps into a coherent framework. This overhaul reduced duplicate cleanup from hours of manual cross-checking to seconds via intelligent error parsing. Furthermore, the UniquePath function I developed not only eliminated data overwrite risks but established a new standard for audit compliance, becoming a reusable module now adopted by other internal projects.