vipshek · 5 months ago
I don't have much to say about this post other than to vigorously agree!

As an engineer who's full-stack and has frequently ended up doing product management, I think the main value I provide organizations is the ability to think holistically, from a product's core abstractions (the literal database schema), to how those are surfaced and interacted with by users, to how those are talked about by sales or marketing.

Clear and consistent thinking across these dimensions is what makes some products "mysteriously" outperform others in the long run.

skydhash · 5 months ago
It's one of the core ideas of Domain-Driven Design. In the early stages of the process, engineers should work closely with stakeholders to align on a set of terms (primitives, as another commenter put it), define them, and put them in neat little contextual boxes.

If you get this part right, then everything else becomes an implementation effort. You're no longer fighting the system; you flow with it. Ideas become easier to brainstorm, and the cost of changes is immediately visible.
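To make the idea concrete, here's a minimal sketch of what "terms in neat little contextual boxes" can look like in code. The domain (a toy order flow), and every name in it, is invented for illustration; the point is that each stakeholder term becomes an explicit type with its rules attached, rather than a bare dict or string.

```python
from dataclasses import dataclass
from enum import Enum

class OrderStatus(Enum):
    DRAFT = "draft"
    PLACED = "placed"
    FULFILLED = "fulfilled"

@dataclass(frozen=True)
class SKU:
    """A stock-keeping unit -- the term sales and the warehouse agree on."""
    code: str

@dataclass
class Order:
    sku: SKU
    quantity: int
    status: OrderStatus = OrderStatus.DRAFT

    def place(self) -> None:
        # The transition rule lives with the term it governs.
        if self.status is not OrderStatus.DRAFT:
            raise ValueError("only a draft order can be placed")
        self.status = OrderStatus.PLACED

order = Order(sku=SKU("ABC-123"), quantity=2)
order.place()
print(order.status)  # OrderStatus.PLACED
```

Once the terms are pinned down like this, "the cost of changes is immediately visible" is literal: renaming or reshaping a primitive shows you every place it's used.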

Denzel · 5 months ago
Immediately thought DDD too!

DDD suggests continuous two-way integration between domain experts <-> engineers, to create a model that makes sense for both groups. Terminology enters the language from both groups so that everyone can speak to each other with more precision, leading to the benefits you stated.

pavel_lishin · 5 months ago
> If you get this part right

And yet it's so easy to get wrong.

We ended up with something like five microservices - that, in principle, could have been used by anyone else in the company to operate on the Domains they were supposed to represent and encapsulate. This one holds Users and User data! This one holds Products, and Product interaction data!

Nobody touched any of those except us, the engineers working on this one very specific product. We could have - should have - just put it all in one service, which would have let us trivially run database joins instead of having services constantly call each other for data and stitch it together in code.

Sigh.
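For anyone who hasn't lived this: the "stitching in code" that several HTTP-calling services force on you collapses into one query when the data shares a database. A toy sketch with SQLite (schema and data invented for illustration):

```python
import sqlite3

# Users and product-interaction data in one database: the cross-service
# stitching becomes a single join.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE product_views (user_id INTEGER, product TEXT);
    INSERT INTO users VALUES (1, 'ada'), (2, 'grace');
    INSERT INTO product_views VALUES (1, 'widget'), (1, 'gadget'), (2, 'widget');
""")

rows = db.execute("""
    SELECT u.name, COUNT(*) AS views
    FROM users u JOIN product_views pv ON pv.user_id = u.id
    GROUP BY u.name ORDER BY u.name
""").fetchall()
print(rows)  # [('ada', 2), ('grace', 1)]
```

With separate microservices, that one `JOIN` becomes two network calls plus merge logic in application code.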

tharkun__ · 5 months ago
Now how do you get your company to hire (or do the hiring yourself) in such a way that you can basically have a team of people like this work with PMs to build their ideas?

I like doing this FS journey myself, but I'm stuck "leading teams" of FS/BE/FE mixes, trying to get them to build things I clearly understand and could build myself given enough time. All I have is a team of FE or BE people, or even FS people, who can't just do the above. You need to be very involved with these people to get them to do this, and it just doesn't scale.

I've recently tried AI (Claude specifically) and I feel like I can build things with Claude much quicker than with the FE/BE/FS people I have. Even including all the frustrations that Claude brings when I have to tell it that it's bullshitting me.

Is that bad? How do you deal with that? Advice?

xp84 · 5 months ago
Yes, and conversely, when the initial model misjudged future needs, the most disastrous projects are those where the requirements or the technical design fly in the face of the original model. When this is solved sloppily, the slow degeneration often begins - from an application that makes sense to a spaghetti mess navigable only by people who were around when those weird bolt-ons happened. Usually not just the code but also the UI reflects this, as even massive UI overhauls (like Atlassian's in 2025) tend to sweep everything awkward under a rug -- those things are still necessary to manage the complexity, but now they're hidden under ••• → Advanced Settings → Show More → All Settings.
ToucanLoucan · 5 months ago
I don't suppose you have any tips on how to get this going in an org? I love where I work and I love the products we make, but my team (phone apps) are treated very often like an afterthought; we just receive completed products from other teams, and have to "make them work." I don't think it's malicious on the part of the rest of the teams, we're just obviously quite a bit smaller and younger than the others, not to mention we had a large departure just as I arrived in the form of my former boss who was, I'll fully admit, far more competent in our products than I am.

I've worked on learning all I can, and I have a much easier time participating in discussions now, but we still feel a bit siloed off.

yazmeya · 5 months ago
> sales or marketing

Also operations and customer support. They are your interface to real, not hypothetical, customers.

esjeon · 5 months ago
> The companies that win won’t be those with the most or even the best features. AI will democratize those. The winners will be built on a data model that captures something true about their market, which in turn creates compounding advantages competitors can’t replicate.

And that's why I think the future of the software industry is data-driven, and we will end up with another GNU-like movement around free and open data models/schemas. I think we already have a good starting point: Linked Data[1] and schema.org[2]

[1]: https://www.w3.org/wiki/LinkedData

[2]: https://schema.org/
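For anyone who hasn't seen what an open data model looks like in practice: schema.org terms are typically embedded in pages as JSON-LD. A minimal sketch - the `@context`/`@type` keys and property names come from the public schema.org vocabulary, but the values here are made up:

```python
import json

# A tiny schema.org Product description serialized as JSON-LD.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "sku": "EW-001",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "19.99",
    },
}

doc = json.dumps(product, indent=2)
print(doc)
```

Because the vocabulary is shared, any consumer that understands schema.org can interpret this without coordinating with the producer - which is exactly the GNU-like network effect for data models.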

bayindirh · 5 months ago
Open Science folks understood this fact around 2018 IIRC, and there are a couple of nice standards for encapsulating research data such as RO-Crate [0].

Moreover, the science folks are not a picky bunch; they tend to use what works well, whether it be CSV or XML. As long as there's tooling and documentation, everything is acceptable, which is something I like.

[0]: https://www.researchobject.org/ro-crate/

kitd · 5 months ago
This was also the aim of RDF and the various metadata schemas like Dublin Core, to standardise ontologies for marking up knowledge.
Jgrubb · 5 months ago
I totally agree and would like to shill for the FOCUS project - https://focus.finops.org/focus-specification/ - which is an open source project to normalize and standardize the billing format of cloud vendors and SaaS vendors alike. It brings greater transparency and efficiency to understanding that massive cloud bill your company pays every month.

I've used this schema to merge together AWS, GCP, and Azure into 1 unified cloud bill, which unlocks a ton of understanding of where the money is going inside the cloud bills.


kristianc · 5 months ago
There's a term for this - inventing a new primitive. A primitive is a foundational abstraction that reframes how people understand and operate within a domain.

A primitive lets you create a shared language and ritual ("tweet"), compound advantages with every feature built on top, and lock in switching costs without ever saying the quiet part out loud.

The article is right that nearly every breakout startup manages to land a new one.

AdieuToLogic · 5 months ago
Another industry term for this is defined in the Domain Driven Design world as a domain's "ubiquitous language"[0]:

  These aspects of domain-driven design aim to foster a 
  common language shared by domain experts, users, and 
  developers—the ubiquitous language. The ubiquitous language 
  is used in the domain model and for describing system 
  requirements.
0 - https://en.wikipedia.org/wiki/Domain-driven_design#Overview

skeezyjefferson · 5 months ago
"A primitive is a foundational abstraction that reframes how people understand and operate within a domain" is good, but "and lock in switching costs without ever saying the quiet part out loud" is just chef's kiss.
sethammons · 5 months ago
I call it lego pieces. We want to enable teams to compose useful units together; to enable builders (generally internal teams) to build things with a clear mental model. "Primitives" are the same: base unit of abstraction for the domain.
skeezyjefferson · 5 months ago
I think you're actually serious, but this is excellent satire.
AdieuToLogic · 5 months ago
For me, this is a "near miss" in that the data model is an implementation detail. Instead, the subtitle identifies where the value resides:

  Your product's core abstractions determine whether new 
  features compound into a moat or just add to a feature list.
Which is captured by the Domain Model[0]. How it is managed in a persistent store is where a data model comes into play.

See also Domain Driven Design[1].

0 - https://en.wikipedia.org/wiki/Domain_model

1 - https://en.wikipedia.org/wiki/Domain-driven_design

android521 · 5 months ago
there is a subtle difference; it's not just domain-driven design. It is basically trying to innovate a new way to think about an existing domain (e.g. docs vs blocks in note taking). ~ "Your data model is your destiny. The paradox is that this choice happens when you know the least about your market, but that’s also why it’s so powerful when you get it right. Competitors who’ve already built on different foundations can’t simply copy your insight. They’d have to start over, and by then, you’ve compounded your advantage."
bonesss · 5 months ago
You’re describing core features of Domain Driven Design.

Innovating, evolving, creating, and capturing new domain concepts to create Blue Ocean solutions inside and outside the Enterprise. Iterating on core concepts, via subject matter expert led/involved discussions and designs, and using new concepts to better articulate the domain. Managing that change over time and accounting for ontological and taxonomical overlap versus Enterprise System development needs.

That’s the foundation that can actively copy insights, and doesn’t rely on Immaculate Specification or premature data modelling. No need to start over, thanks to clearly separated concerns.

Note: copying an insight is a far cry from having the wherewithal to make that insight, there are numerous downstream benefits to articulating your business domain clearly and early.

AdieuToLogic · 5 months ago
"Data model" is a software engineering term of art[0] which identifies artifacts specific to managing the persistent representation of information relevant to a system's operation. This representation is often a different, simplified, version of what a system uses internally to define and provide its value.

> It is basically trying to innovate a new way to think about in an existing domain ...

Note that a domain is not the same as a domain model.

> "Your data model is your destiny."

This is why I consider the article a "near miss." If the above quote from the post was instead "your domain model is your destiny", the subsequent quoted statements not only would need no alteration but would substantiate the topic at hand being domain modeling and the organizational value found therein.

0 - https://www.merriam-webster.com/dictionary/term%20of%20art

lysecret · 5 months ago
I agree was about to comment that this article fits a domain model more than a data model.
bob1029 · 5 months ago
A well engineered data model can also be used as the basis for a business rules engine. This is popular in enterprise environments that use technology like Oracle DB or MSSQL. It is possible to implement all the core business logic as stored procedures and functions, which can be invoked directly from something like a web server. Instead of putting all the session validation logic in backend code, it can live in PL/SQL, T-SQL, etc.

The benefit to having the logic and the data combined like this is difficult to overstate. It makes working in complex environments much easier. For example, instead of 10 different web apps each implementing their own session validation logic in some kind of SSO arrangement, we could have them call a procedure on the sql box. Everyone would then be using the same centralized business logic for session validation. Any bugs with the implementation can be fixed in real time without rebuilding any of the downstream consumers.
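A sketch of the pattern - SQLite has no stored procedures, so this approximates the idea with a database VIEW instead of the PL/SQL or T-SQL procedures described above, and all table and column names are invented. The point is the same: the validity rule lives in the database, every consumer runs the identical logic, and the rule can change without redeploying any app.

```python
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE sessions (
        token TEXT PRIMARY KEY,
        expires_at INTEGER,
        revoked INTEGER DEFAULT 0
    );
    -- The centralized rule: what "valid session" means, defined once.
    CREATE VIEW valid_sessions AS
        SELECT token FROM sessions
        WHERE revoked = 0
          AND expires_at > CAST(strftime('%s', 'now') AS INTEGER);
""")

now = int(time.time())
db.execute("INSERT INTO sessions VALUES ('live', ?, 0)", (now + 3600,))
db.execute("INSERT INTO sessions VALUES ('stale', ?, 0)", (now - 3600,))

def is_valid(token: str) -> bool:
    # Every web app shares this one query; the rule itself stays in the DB.
    row = db.execute(
        "SELECT 1 FROM valid_sessions WHERE token = ?", (token,)
    ).fetchone()
    return row is not None

print(is_valid("live"), is_valid("stale"))  # True False
```

Tightening the expiry policy is then a one-line change to the view, visible to all ten web apps at once - which is both the benefit claimed here and the coupling the reply below warns about.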

sethammons · 5 months ago
Counter point: spooky code at a distance is bad. Splitting your code so it lives partly in source control and partly in the database means keeping multiple layers in sync. That is coupling, and coupling multiple things together - especially teams - means increased overhead.

I have seen business rules as stored procedures lock a business into its current model across a dozen teams, effectively making system improvements impossible. And because they needed some OLAP atop OLTP in some cases, their very beefy Postgres setup crawled down to a max of 2k queries per second. I worked with them for over a year trying to pull apart domain boundaries and unlock teams from one another. Shared stored procedures were a major pole in the tent of things making it incredibly hard to scale the technical org.

Repeat after me: uncoupled teams are faster.

ryanrasti · 5 months ago
+100 to you both. This is the classic tradeoff: powerful, centralized DB logic vs. clean but often anemic app code.

I'm building Typegres to give you both. It lets you (a) write complex business logic in TypeScript using a type-safe mapping of Postgres's 3000+ functions and (b) compiles it all down to a single SQL query.

Easier to show than tell: https://typegres.com/play

skeezyjefferson · 5 months ago
Was the only hammer you had SQL-shaped? SQL as an SSO solution is a new one to me.
cyberax · 5 months ago
A veritable thread with bad advice!


dkarl · 5 months ago
This is an application of an engineering term to a product-level concept, but it fits. I guess you'd say "domain model" in product-speak, but to my engineering brain it doesn't evoke the cascading consequences of the model for the rest of the system. It's a rare product manager who treats the domain model as a consequential design product and a potential site of innovation.
jamesblonde · 5 months ago
I was expecting a discussion of the foundations of data modelling: star schema vs snowflake schema data models vs one big table. The benefits of 3NF vs when you have to denormalize.

This underlying choice of data model actually does define your destiny. What I think the author was thinking of is domain modelling and correct entity identification, which is also important. It's a layered approach - and if you ignore the foundations (the actual data model), you hit limitations higher up.

For example, in real-time AI systems, you might want users to provide a single value (like an order number) to retrieve precomputed features for a model. With Snowflake Schema data models, it works. But for Star Schema data models, you have to provide entity IDs for all tables containing precomputed features - which leads to big problems (the need for a mapping table, a new pipeline, and higher latency).

Reference: https://www.hopsworks.ai/post/the-journey-from-star-schema-t...
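A toy illustration of the retrieval point above - in a normalized (snowflake-style) model, the order row carries foreign keys, so a single order number is enough to walk to every entity's precomputed features with joins. Table and column names are invented for the example:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER,
        product_id INTEGER
    );
    CREATE TABLE customer_features (customer_id INTEGER PRIMARY KEY, avg_spend REAL);
    CREATE TABLE product_features (product_id INTEGER PRIMARY KEY, return_rate REAL);
    INSERT INTO orders VALUES (42, 7, 3);
    INSERT INTO customer_features VALUES (7, 120.5);
    INSERT INTO product_features VALUES (3, 0.02);
""")

# One order number in, all precomputed features out - no mapping table,
# no extra pipeline.
row = db.execute("""
    SELECT cf.avg_spend, pf.return_rate
    FROM orders o
    JOIN customer_features cf ON cf.customer_id = o.customer_id
    JOIN product_features pf ON pf.product_id = o.product_id
    WHERE o.order_id = ?
""", (42,)).fetchone()
print(row)  # (120.5, 0.02)
```

In a star-schema layout without those foreign-key chains, the caller would have to supply the customer and product IDs itself - hence the mapping table, extra pipeline, and added latency described above.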

pegasus · 5 months ago
I prefer your terminology. That being said, domain modelling (what the article describes) comes first, hence is more foundational and important than data modelling.
majke · 5 months ago
I totally agree. Early-days Cloudflare was a great example of this. We treated IP addresses as data, not as configuration. New subnet? INSERT INTO and we're done. Blocked IP? DELETE FROM, and tadam. This was a huge differentiator from other CDNs and allowed us extreme flexibility. The real magic and complexity was in automatically generating and managing HTTPS certs (days before SNI).
vecter · 5 months ago
Can you explain more? I don’t understand the distinction in this case between data and configuration in the context of IP addresses.
majke · 5 months ago
In the simplest scenarios, software is not aware of the IP space. You bind to 0.0.0.0:443 and move on.

In more sophisticated configs, adding/removing IPs or TLS certs requires restarting the server and reconfiguring applications. This gets out of hand quickly - what if your server has its primary IP removed because the IP space is recycled?

At CF all these things were just a row in a database, and systems were created to project it down to HTTP server config, network card settings, BGP configuration, etc. All of this was fully automated.

So an action like "adding an IP block" was super simple. This was unique - AFAIK everyone else in the industry, back in 2012, was treating IPs and TLS certs more like hardware. Like a disk: you install it once and it stays there for the lifetime of the server.
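A toy sketch of the "project it down" idea (all names invented; the real systems projected to BGP, NIC, and HTTP-server config rather than a string): address blocks live in a table, config is regenerated from it, and adding or retiring a block is one INSERT or DELETE.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ip_blocks (cidr TEXT PRIMARY KEY, site TEXT)")
db.execute("INSERT INTO ip_blocks VALUES ('198.51.100.0/24', 'sfo')")
db.execute("INSERT INTO ip_blocks VALUES ('203.0.113.0/24', 'ams')")

def render_listen_config() -> str:
    # Projection step: derive server config from the table on demand.
    rows = db.execute("SELECT cidr FROM ip_blocks ORDER BY cidr").fetchall()
    return "\n".join(f"listen {cidr}" for (cidr,) in rows)

# Retiring a block is a DELETE, then re-projection - no hand-edited config.
db.execute("DELETE FROM ip_blocks WHERE cidr = '203.0.113.0/24'")
print(render_listen_config())  # listen 198.51.100.0/24
```

The flexibility comes from the direction of the arrow: config is always derived from data, never the source of truth itself.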

dgb23 · 5 months ago
Not OP but I think the insight was to treat them as first class objects that are interacted with directly. The implementation itself seems secondary.