If you’re reading this you probably feel it too: data, analytics and AI aren’t just here; they’re business-critical. Whether you’re in manufacturing, retail, finance or public services, the question has shifted from “Should we do analytics?” to “How do we do it better?”
In that light, I want to walk you through four big frameworks/players and show how they can help you (or your clients) turn “lots of data + big ambition” into “real business outcomes”. We’ll cover:
SAP’s new relational foundation model (from TechEd)
Palantir and its operational-analytics workflow platform
Databricks and its lakehouse/data + AI engine
Snowflake and its cloud data/AI platform
For each I’ll highlight what it is, why it matters, and how you might use it (with concrete use-cases). Let’s dive in.
At SAP TechEd 2025, SAP announced a big leap: a foundation model called SAP RPT‑1 (Relational Pre-Trained Transformer 1) which is not about generating words, but about predicting business outcomes from relational/tabular data.
Most foundation models (LLMs) work on text; SAP is explicitly positioning RPT-1 for structured business data — that means things like sales orders, delivery delays, payment risk.
If you have lots of enterprise-system data (ERP, CRM, supply chain), RPT-1 promises faster, higher-quality predictions without training hundreds of narrow models.
SAP emphasised that “AI is nothing without well-organised data” and that the foundation is the data fabric.
Delivery-delay risk prediction: If you’re a manufacturer or distributor, you likely have tables of orders, shipments, carrier status, inventory. Feed that into RPT-1 to flag orders likely to be delayed so you can proactively notify customers or reroute logistics.
Payment risk / collections: B2B firms often struggle with which invoices will go unpaid or become late. Using RPT-1 on historical payment/credit data you can prioritise collections efforts where risk is highest, improving cash-flow and reducing write-offs.
Sales-order completion / upsell: For a company selling multi-item orders, you might predict which orders will complete vs which get cancelled or reduced, allowing sales teams to intervene or adjust expectations.
Instead of analysts spending weeks building separate predictive models for each business question, you have a single foundation model that understands relational data in your context. That means faster insights, fewer bespoke tools, and more time for applying what you learn rather than building the pipeline.
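To make the workflow shape concrete, here is a minimal sketch of the delivery-delay use-case. RPT-1’s client API is not shown in this post, so `predict_delay_risk`, the `Order` fields, and the coefficients below are all invented stand-ins; the point is the pattern — one model scoring relational rows, with no per-question model-building project.

```python
import math
from dataclasses import dataclass

# Hypothetical stand-in for a relational foundation model such as SAP RPT-1.
# The real model and its API are not public in this sketch; what matters is
# the workflow shape: score tabular rows, flag the risky ones, act early.

@dataclass
class Order:
    order_id: str
    carrier_on_time_rate: float  # historical on-time rate for the carrier (0..1)
    items_backordered: int       # line items currently out of stock
    distance_km: float           # shipping distance

def predict_delay_risk(order: Order) -> float:
    """Toy logistic score standing in for a foundation-model prediction."""
    z = (
        -2.0
        + 3.0 * (1.0 - order.carrier_on_time_rate)  # unreliable carrier -> higher risk
        + 0.8 * order.items_backordered              # backorders -> higher risk
        + 0.001 * order.distance_km                  # long hauls -> slightly higher risk
    )
    return 1.0 / (1.0 + math.exp(-z))

orders = [
    Order("SO-1001", carrier_on_time_rate=0.98, items_backordered=0, distance_km=120),
    Order("SO-1002", carrier_on_time_rate=0.71, items_backordered=3, distance_km=900),
]

# Flag orders above a risk threshold so ops can notify customers or reroute.
at_risk = [o.order_id for o in orders if predict_delay_risk(o) > 0.5]
print(at_risk)  # → ['SO-1002']
```

The same pattern would cover the payment-risk and order-completion cases: swap the input table and the question, not the tooling.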
Palantir Technologies’ core strength is in operational analytics: bringing together disparate data, building workflows around it, and enabling decision-makers to act. Their documentation is rich with concrete use-cases.
Many organisations struggle not because they lack data, but because their data is siloed (ERP, supply chain, CRM, manufacturing). Palantir’s platform (Palantir Foundry) helps integrate legacy systems, harmonise data, and wrap analytics into operational workflows.
It is built for action — not just dashboards. The idea is: data → insight → workflow → decision.
Good governance, ontology (common business objects) and scalability are baked in, which is critical for large enterprises.
Optimising production & COGS: A consumer-goods giant used Foundry to integrate 7+ ERP systems, built a digital twin of its value chain, and began optimising raw-material purchases and production formulations — saving tens of millions.
Campaign management in healthcare: One case: a healthcare provider used Palantir to segment members (based on claims, prescriptions, demographics), run integrated campaigns, and achieved a 1.6× increase in vaccination rates.
Pricing / margin optimisation: In the chemicals sector, Palantir enabled sales & marketing to synchronise CRM data with Foundry, propose quotes with embedded margin analytics and accelerate deal closure.
What I love about Palantir’s story is that it flips a common frustration: “We have tons of data but it’s unusable” to “Now we can use our data to operate smarter”. It’s not just about pulling numbers — it’s about enabling the people in plants, supply-rooms, customer-service or marketing to act, based on real-time data and workflow. If your client is tired of “reports that arrive too late” or “we can’t integrate legacy ERPs”, this is exactly the kind of solution they’ll relate to.
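The segmentation step behind a campaign like the vaccination example can be sketched in a few lines. The member records and segment rules below are invented for illustration; in Foundry this logic would run against governed ontology objects and feed directly into an outreach workflow rather than a print statement.

```python
from collections import defaultdict

# Invented member records and rules, illustrating the data -> insight ->
# workflow pattern: segment members, then route each segment to an action.

members = [
    {"id": "M1", "age": 72, "last_flu_shot_year": 2021, "chronic_condition": True},
    {"id": "M2", "age": 34, "last_flu_shot_year": 2024, "chronic_condition": False},
    {"id": "M3", "age": 65, "last_flu_shot_year": 2019, "chronic_condition": False},
    {"id": "M4", "age": 29, "last_flu_shot_year": 2018, "chronic_condition": True},
]

def segment(member: dict, current_year: int = 2025) -> str:
    """Assign each member to one campaign action."""
    overdue = current_year - member["last_flu_shot_year"] >= 2
    if member["age"] >= 65 and overdue:
        return "high_priority_outreach"
    if member["chronic_condition"] and overdue:
        return "clinical_outreach"
    if overdue:
        return "reminder_email"
    return "no_action"

campaigns = defaultdict(list)
for m in members:
    campaigns[segment(m)].append(m["id"])

print(dict(campaigns))
```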
Databricks is well positioned for companies that want unified data engineering, analytics, ML and streaming in one platform. Their “Data Intelligence Platform” has many real-world use cases.
Many firms have a split architecture: one system for analytics, another for machine-learning, another for streaming. Databricks offers a consolidated “lakehouse” approach (data lake + warehouse) which simplifies operations.
Real-time/stream use cases: if your client needs streaming ingestion, very large data volumes, ML + analytics in one place — Databricks shines.
Collaboration: data scientists, engineers, analysts all can work in the same platform, reducing hand-offs.
Streaming + predictive ops: For example, a manufacturing/engineering firm processing terabytes of real-time data daily to optimise operations and reduce risk.
Customer-360 + personalization: A retailer using Databricks to process data across stores, build personalized experiences, and run machine-learning models in production.
Unified orchestration of data & ML: Instead of separate ETL pipelines, model training environments and BI tools, Databricks offers unified pipelines.
Clients often say: “We have data scientists, we have dashboards, but they don’t talk to each other.” With Databricks you’re better positioned to say: “Let’s build your data-engineering house, then analytics and deployable models live in one platform.” That means faster time-to-value, fewer silos, and potentially more impactful change.
Snowflake Inc. continues to be a trusted platform for companies seeking a cloud-native data warehouse/lakehouse with strong governance, ease of use and broad ecosystem support. There are many strong use cases:
Simplicity and speed: For organisations where BI, analytics and data sharing across teams is the main need (rather than heavy streaming + ML), Snowflake is often a gentler entry point.
Multi-structured data: Snowflake supports semi-structured data (JSON, Parquet) alongside structured, letting organisations bring more diverse data into analytics.
Data sharing & governance: Especially for cross-company data sharing (e.g., suppliers, partners), Snowflake offers secure ways to do that without copying data.
Unified view across departments: One healthcare provider used Snowflake to bring EHR records + wearable-device JSON + physician notes, enabling analysis that previously required weeks of pre-processing.
Document-chatbot on large repository: In one case, an energy company with ~800,000 technical documents used Snowflake + RAG (retrieval-augmented generation) to let engineers ask “What’s the recommended torque for this turbine component?” and get instant answers.
Standard analytics for retail/finance: Snowflake used for seasonal sales analytics, rebate programs, predictive modeling of churn.
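The semi-structured healthcare case boils down to turning nested JSON into analysable rows, which Snowflake handles natively with its VARIANT type and the FLATTEN table function. This toy shows the equivalent transformation in plain Python; the wearable payload shape is invented for illustration.

```python
import json

# Invented wearable payload: one patient document containing a nested
# array of readings, flattened into one row per reading (what Snowflake's
# LATERAL FLATTEN would do in SQL).

payload = json.loads("""
{
  "patient_id": "P-42",
  "device": "wristband",
  "readings": [
    {"metric": "heart_rate", "value": 62},
    {"metric": "steps", "value": 8412}
  ]
}
""")

rows = [
    {"patient_id": payload["patient_id"],
     "metric": r["metric"],
     "value": r["value"]}
    for r in payload["readings"]
]
print(rows)
```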
If a client says: “We just need to get our data house in order, unify data, empower our business teams with self-service, and maybe start with AI down the line” — Snowflake is a credible, fast-win option. It means less “IT overhaul” and more “business teams up and running”.
Here’s a friendly checkpoint you can use when advising clients:
| Question | If the answer is “Yes” → Consider… |
|---|---|
| Do you have lots of structured business/ERP data and want predictive models quickly? | SAP’s RPT-1 + SAP data fabric |
| Do you struggle with many legacy systems, need operational workflows (plants, supply chain, logistics) and want data-driven decisioning? | Palantir Foundry |
| Is your organisation building large-scale data pipelines, streaming ingestion, ML at scale, lakehouse style? | Databricks |
| Do you need analytics, self-service, cross-business-unit dashboards, data sharing, and a simpler cloud-based data platform? | Snowflake |
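For fun, the checkpoint can be encoded as a tiny helper; the questions and mapping simply mirror the table above (and the answers are not mutually exclusive, so you can get more than one recommendation).

```python
# Encodes the advisory checklist above. Each flag corresponds to one
# "Yes" answer in the table; multiple picks are possible.

def recommend(erp_predictions: bool, operational_workflows: bool,
              large_scale_ml: bool, self_service_analytics: bool) -> list:
    picks = []
    if erp_predictions:
        picks.append("SAP RPT-1 + SAP data fabric")
    if operational_workflows:
        picks.append("Palantir Foundry")
    if large_scale_ml:
        picks.append("Databricks")
    if self_service_analytics:
        picks.append("Snowflake")
    return picks or ["Start by clarifying the use-case"]

print(recommend(erp_predictions=True, operational_workflows=False,
                large_scale_ml=False, self_service_analytics=True))
# → ['SAP RPT-1 + SAP data fabric', 'Snowflake']
```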
Start small – but plan big: Pick a use-case with clear ROI (e.g., reduce late shipments by 10 %, reduce raw-material waste by 5 %) but ensure you design for scaling.
Get data governance & modelling right: All these platforms emphasise that data quality, ontology (common business objects) and governance are foundational. Poor data quality will hamper success on any of them.
Blend platforms if needed: These are not mutually exclusive. For example, SAP + Snowflake integration is already happening.
Focus on people and workflows, not just tech: The biggest value comes when insights are acted upon. If you build a predictive model but business teams ignore it, you’ve missed the real prize.
In 2025 and beyond, the question isn’t whether we should use data and AI — it’s how we turn them into value that people care about. The four frameworks above provide powerful options:
SAP’s RPT-1 for tabular-business-data predictive modelling
Palantir for operational decision-support across systems and workflows
Databricks for large-scale data + ML convergence
Snowflake for fast, cloud-native analytics and data sharing