Redesigning a data-intensive cloud platform for geoscientists to visualize, interpret, and collaborate on subsurface assets.
i2G is a high-performance, cloud-based SaaS platform for geoscientists and operators in the energy sector—a hub to visualize, interpret, and manage a massive OSDU-backed catalog (144,000+ objects spanning wellbore logs, seismic surveys, and related metadata), often backed by large files on object storage.
The legacy experience behaved like a database-shaped product: dense grids, deep nesting, and disconnected views that made it hard to orient in geospatial reality before diving into row-level technical fields (API numbers, operator details, trace data, water depths, and more).
The mission was to transform that workflow into a modern, responsive, map-first interface—without sacrificing analytical depth—while keeping pace with a zero-to-one build where design and production code needed to move in tight loops toward a live Vercel-deployed MVP.
Information overload without context—users were bombarded with disjointed tables before they had geospatial anchoring or a clear progressive path from map to metadata.
Execution velocity vs. enterprise complexity—traditional handoffs were too slow for the speed required to ship interactive, stakeholder-ready product surfaces in the browser.
We approached the redesign by first understanding geoscientists’ mental models and the OSDU-shaped data model behind the product. The solution was not a surface-level reskin—it was a restructuring of how data is discovered and consumed. We implemented map-first entry, contextual sidebars, and progressive disclosure so users see the most critical context first, with deeper metadata and file-level detail only a deliberate step away—carried through consistently across well, seismic, and rig domains in a shared component language.
An interactive global map anchors users in geospatial reality before they ever touch a dense grid. Selecting a well or seismic pin opens a contextual right-hand sidebar—platform ID, water depth, operator, target status—without breaking spatial awareness. The same patterns extend across Well and Seismic domain views using one scalable system: clean grids, pill-tag taxonomy, and monospace IDs that generalize across wellbore logs, seismic surveys, and rig data.
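The "one scalable system" across well, seismic, and rig pins could be modeled as a single discriminated union feeding one sidebar component. This is a minimal, hypothetical sketch; the field names (`platformId`, `waterDepthM`, etc.) are illustrative assumptions, not the production schema.

```typescript
// Hypothetical sketch: one discriminated union drives the contextual
// sidebar across domains, so a single component renders any pin type.
type SidebarPayload =
  | { kind: "well"; platformId: string; waterDepthM: number; operator: string; targetStatus: string }
  | { kind: "seismic"; surveyId: string; acquisitionYear: number; operator: string }
  | { kind: "rig"; rigId: string; operator: string; status: string };

// Label/value rows the shared sidebar component would render for a selection.
function sidebarRows(p: SidebarPayload): [string, string][] {
  switch (p.kind) {
    case "well":
      return [
        ["Platform ID", p.platformId],
        ["Water depth", `${p.waterDepthM} m`],
        ["Operator", p.operator],
        ["Target status", p.targetStatus],
      ];
    case "seismic":
      return [
        ["Survey ID", p.surveyId],
        ["Acquired", String(p.acquisitionYear)],
        ["Operator", p.operator],
      ];
    case "rig":
      return [
        ["Rig ID", p.rigId],
        ["Operator", p.operator],
        ["Status", p.status],
      ];
  }
}
```

One union type is what lets a new domain (rigs, for instance) inherit the sidebar, pill tags, and monospace ID treatment without a new component.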
From the map, users move through a deliberate macro → micro path: a focused metadata modal (General, Data, Permissions, Legal) to reduce clutter, then file-level log lists, then full-screen interactive plots for dense technical measurements—turning raw traces into interpretable signal instead of endless tables.
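The deliberate macro → micro path can be sketched as an ordered disclosure ladder, where each deeper level is reachable only one step at a time. Level names here are illustrative assumptions, not the production routes.

```typescript
// Hypothetical sketch of the macro → micro drill path as an ordered
// disclosure ladder: map pin → metadata modal → file-level log list →
// full-screen interactive plot.
const DISCLOSURE_PATH = ["map", "metadata-modal", "file-list", "full-plot"] as const;
type Level = (typeof DISCLOSURE_PATH)[number];

// A deeper level is only ever one deliberate step away; the deepest
// level returns null instead of wrapping around.
function nextLevel(current: Level): Level | null {
  const i = DISCLOSURE_PATH.indexOf(current);
  return i < DISCLOSURE_PATH.length - 1 ? DISCLOSURE_PATH[i + 1] : null;
}
```

Encoding the ladder as data (rather than scattered navigation handlers) is one way to guarantee users always get geospatial context before row-level detail.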
Beyond traditional filtering, an “Ask AI” surface lets geoscientists query in natural language (for example, summarizing metadata for a named well). The experience is designed as a pervasive, fast-invoked assistant layer that complements—not replaces—the core analytical UI.

Two operationally critical surfaces: a global search index spanning 144,000+ OSDU objects to eliminate dead-end navigation, and an executive dashboard that consolidates the data space into a single command view—object counts, pipeline activity, storage utilization, and ingestion health for operators who need a live pulse without hopping across domains.
Enterprise trust shows up in the long tail: profile identity, notifications for processing and maintenance events, security (password + 2FA), multi-device session management, backups and import/export, appearance (including dark mode for shift-based work), and system diagnostics—each screen treated with the same hierarchy and clarity as the core product.
Mapped 45+ unique workflow paths for seismic interpretation and data curation.
Created 200+ reusable UI components in Figma with strict auto-layout and variant logic.
Developed click-through prototypes for stakeholder validation and dev handoff.
Engineered a custom color palette specifically for geological log readability.
Reduced time spent searching for wellbore assets across global repositories.
Successfully migrated legacy desktop logic into a world-class browser experience.
144,000+ OSDU objects mapped and navigable through a unified map, domain, and search experience.
Production-ready React MVP shipped for stakeholder review—not static concept screens alone.
Design systems moved directly into implementation with tight design-to-dev iteration.
Natural-language assistance layered across the product for faster answers to domain questions.
"Jastej took our complex legacy platform and transformed it into something our team actually wants to use. The AI chat feature was a game-changer — our geoscientists can now query data naturally instead of navigating through endless menus. And seeing it live dynamically via Vercel during reviews completely changed our process."
Experience the actual platform