bob1029 · 3 years ago
I believe the easiest way to think about it is to step away from your programming tools and start modeling your problem domain as tables in Excel.

Once you have a relational schema that the business can look at and understand, then you go implement it with whatever tools and techniques you see fit.

This is what “data-oriented” programming means to me. It’s not some elegant code abstraction. It’s mostly just a process involving people and business.

Even for non-serious business, these techniques can wrangle complexity that would otherwise be insurmountable.

I still think the central piece of magic is embracing a relational model. This allows for things like circular dependencies to be modeled exactly as they are in reality.

jackosdev · 3 years ago
This confused me, as when I hear "data-oriented" I think data structures that are optimised around minimising CPU cache misses by using better alignments, using enums where possible, not storing results of simple calculations etc. There is a popular book and there are popular talks on the subject. It probably confuses other people as well, I'd imagine.
bob1029 · 3 years ago
> I think data structures that are optimised around minimising CPU cache misses by using better alignments, using enums where possible, not storing results of simple calculations etc.

You might be surprised to learn that modeling your problem in terms of normalized relational tables ultimately achieves similar objectives. The more normalized the schema, the more packed you will find its in-memory representation.
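
A minimal sketch of what that can look like in memory, assuming a hypothetical authors/books schema (the names are illustrative, not from the thread): each normalized table maps onto packed parallel arrays, with foreign keys as plain integer indices rather than object references.

    // "authors" table as parallel arrays: a row is just an index.
    final class AuthorTable {
        String[] firstName = new String[1024];
        String[] lastName  = new String[1024];
    }

    // "books" table; authorId is an integer foreign key into AuthorTable,
    // so the columns stay densely packed instead of pointing at scattered objects.
    final class BookTable {
        int[] authorId  = new int[4096];
        String[] title  = new String[4096];
    }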

viebel · 3 years ago
alphanumeric0 · 3 years ago
I'm having a hard time thinking of a way code can ever be fully decoupled from data. When we decide it's better to have a name field rather than firstName and lastName, does that mean we simplify NameCalculation.fullName to just return data.name? This seems to suggest we still have code coupled to data (the data structure being an object); it's just now a coupled function, but you have decoupled it enough to use NameCalculation in different contexts. Single-responsibility classes are already recommended for reuse like this in OO.

Also, when it comes to data validation, OO performs all of this validation too, and in a much more compact, code-oriented, and extensible way. Why would I write a separate schema when the object itself knows what it will accept, what is optional, and what range the values should be in? I'd imagine the schema and code could become inconsistent with each other.

weavejester · 3 years ago
Just imagine you're sending the data across a network, instead of between local functions. If you have a web service that spits out JSON, then you have data that is decoupled from code. That's not to say that the JSON data isn't then read and manipulated by code; just that no specific code is associated with the data.

As for why you'd want to do this, well, one reason is that it makes it easier to bounce data between different services. You don't need to perform any sort of conversion if you're operating directly on the data you're receiving and sending.

The second argument for this style is perhaps more ideological. In the Clojure community in particular, complexity is seen as arising from coupled components. The more things you can decouple, the less complex your codebase. The less complex your codebase, the more reliable and extensible it is.

Edit: another potential advantage is that it's easier to use generic functions to interrogate and manipulate data that isn't encapsulated in specific types or objects.
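
A minimal sketch of that last point in Java, with made-up field names: once the data is just a generic map, ordinary collection functions can interrogate it without any type having been written for it.

    import java.util.List;
    import java.util.Map;

    class GenericQuery {
        public static void main(String[] args) {
            // Data as it might arrive off the wire: no class is defined for it.
            List<Map<String, Object>> authors = List.of(
                Map.<String, Object>of("name", "Isaac Asimov", "books", 500),
                Map.<String, Object>of("name", "Harper Lee", "books", 2));

            // A generic query: the same code works for any map that has a "books" key.
            long prolific = authors.stream()
                .filter(a -> (int) a.get("books") > 100)
                .count();
            System.out.println(prolific); // prints 1
        }
    }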

theteapot · 3 years ago
> Just imagine you're sending the data across a network, instead of between local functions. If you have a web service that spits out JSON, then you have data that is decoupled from code. That's not to say that the JSON data isn't then read and manipulated by code; just that no specific code is associated with the data.

That's not really true in classical OO or DOP. There is always code that depends on specifics of some data. In classical OO it's extremely common to de-marshal data straight off the wire and into a class (AKA hydration). From then on the thing that interacts with the data structure directly is the object instance.

skippyboxedhero · 3 years ago
Because the problem with OO is that if you have any kind of cross-cutting concern, it collapses totally.

For example, I have a Wizard object. My wizard has a wand, so we store the wand object on our Wizard object. Simple. But then my wizard casts a spell and does damage to a goblin. Do we put the cast method on the wand or the wizard? There is no real reason to pick one over the other (this problem comes up a lot in game development, which is why this pattern is more common there... Spring is another example; aspect-oriented programming/dependency injection works from similar principles). It is far easier to separate that out entirely and have a pure, reusable cast function that takes the wizard, the goblin, and the weapon, as sketched below.
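
A minimal sketch of that last idea in Java, with hypothetical record and field names chosen purely for illustration:

    // Plain data: no behaviour attached to any of these.
    record Wizard(String name, int power) {}
    record Wand(String name, int bonus) {}
    record Goblin(String name, int health) {}

    final class Spells {
        // The cross-cutting behaviour lives in one pure function that takes
        // everything it needs and returns new data instead of mutating anything.
        static Goblin cast(Wizard caster, Wand wand, Goblin target) {
            int damage = caster.power() + wand.bonus();
            return new Goblin(target.name(), target.health() - damage);
        }
    }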

Another aspect of this problem (which Rust, as an example, makes clear) is that you introduce runtime bugs or hurt performance when you start carrying around a lot of references everywhere. Once you start to think about what actually needs a reference to another object (in Rust this is limited by the borrow checker), you realise why OOP doesn't work in some cases.

OO doesn't perform validation; your code performs validation on the data. You can write a separate schema, or a single shared one, but the problem is that OOP tries to fit a round peg into a square hole for some applications.

Very generally, it is harder to make mistakes if you use something like a data-oriented approach. If you have a lot of code with calculations or interactions, it is very pure, easy to test, and fits well with how people think about those elements (one area where I have found this is financial applications; I actually worked this out on my own and then discovered data-oriented programming existed while building financial-related stuff). In these cases, introducing OOP means state changing in unpredictable ways (and then someone comes into the project, doesn't understand the abstraction, calls a method that is named erroneously, and it all goes wrong).

flarg · 3 years ago
You have a spell object
treis · 3 years ago
>But then my wizard casts a spell, and does damage to a goblin. Do we put the cast method on the wand, the wizard? There is no real reason to pick one over the other

You did pick: "My wizard casts a spell". The cast method goes on the wizard.

dustingetz · 3 years ago
To use Rich Hickey's definitions, data is an observation or measurement at a point in time: [:fullname “John Doe” t1] [:first “John” t1]. No code is needed to see the denormalization, which came from the symbolic model that was chosen. The code only exists to translate between incompatible data models of a program's inputs and outputs.
snidane · 3 years ago
> Why would I write a separate schema when the object itself knows what it will accept, what is optional, and what range the values should be in?

Because data is just data, and its meaning is given at the time of application. If you want to couple validation to the data itself, how do you decide which of the N meanings to validate against?

galaxyLogic · 3 years ago
No, data must have meaning, else it is meaningless.

If you want to process the data you must use some language to access parts of the data. Data must have a symbolic representation, not just be 1s and 0s. Or it can be a bit-stream, but even then you need a language that knows the difference between 1 and 0.

person.name

extracts the field 'name' from the data. To manipulate the data, the program must know that there is such a field as 'name' it can ask for.

oivey · 3 years ago
OO schemas are very strict and in many situations difficult to extend. For example, let's say you have a class named Name that contains firstName and lastName. Let's say that you have a function that consumes lists of Names. Let's say you have yet another class called OtherName that also contains firstName and lastName. That class will not be compatible with the function. Usual OOP suggests you solve this via inheritance, but if you don't own Name or OtherName, that won't help you. OOP's tools for polymorphism are very limited, especially if you don't own all the code you're trying to use (third-party libraries). If the "schema" enforced by the type system didn't include the name of the object, that would open up a lot of possibilities.
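
A minimal sketch of the incompatibility being described, with hypothetical class names:

    // Two structurally identical classes, perhaps from different libraries.
    class Name      { String firstName; String lastName; }
    class OtherName { String firstName; String lastName; }

    class Report {
        // Accepts only Name; a List<OtherName> will not compile against it,
        // even though the shape of the data is identical.
        static void print(java.util.List<Name> names) {
            names.forEach(n -> System.out.println(n.firstName + " " + n.lastName));
        }
    }

With plain maps, or in a language with structural typing, the function would only care that the fields are present, not which class they came from.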
jcelerier · 3 years ago
> OO schemas are very strict

I mean, that's the whole point. You want something that will give you compile errors whenever you change anything, to make sure that you go over all the cases where the change has an impact.

crabmusket · 3 years ago
This is the unfortunate side effect of the Javafication of OOP. Smalltalk doesn't have that problem. Neither does TypeScript, but it has taken us this long to start undoing the Javafication process.
ajuc · 3 years ago
I think in Data-Oriented Programming it's fine for your code to depend on the data structure. You want to separate code and data, not decouple them.

As for why, see for example Command Query Separation (the data-oriented way) vs Tell Don't Ask (the encapsulate-everything way).

drpyser22 · 3 years ago
For validation, this approach would have you write a set of functions that validate the properties of the data.

Nothing forbids a function that applies validation to inputs before returning a data object. Extensibility can be achieved through functional means (e.g. higher-order functions, function composition, lenses) or OOP (the strategy pattern and equivalents, object composition and inheritance, ...). I'm not sure what you mean by more compact and code-oriented.
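
A minimal sketch of that first point in Java, with hypothetical field names: a plain function validates the map and only then hands it back as data.

    import java.util.Map;

    final class Authors {
        // Validate first, then return the plain data unchanged; callers can still
        // treat the result as an ordinary map.
        static Map<String, Object> validated(Map<String, Object> author) {
            if (!(author.get("name") instanceof String)) {
                throw new IllegalArgumentException("name must be a string");
            }
            if (!(author.get("books") instanceof Integer books) || books < 0) {
                throw new IllegalArgumentException("books must be a non-negative integer");
            }
            return author;
        }
    }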

Code is always coupled to an interface, implicitly or explicitly. In the case of OOP, code is coupled to the class, which can represent something specific with very concrete semantics (e.g. employee, author) or something generic that is meant to be subclassed (e.g. person).

randcraw · 3 years ago
These examples spring to mind: 1) high-performance computing (vector processing / SIMD), 2) deep neural nets, 3) graphics. Each of these computation models processes a small number of large blocks of data whose efficient movement is just as important as the efficient number crunching. OOP doesn't serve those emphases as well as DOP does.
alphanumeric0 · 3 years ago
Thank you for all of the different viewpoints, it's starting to make sense now. I've used JSON schema before in one of my previous projects. I'll keep it in mind for next time.
roenxi · 3 years ago
Code is itself data, so a full decoupling is logically impossible.

Data is going to have an implicit schema regardless, because that is just how data works. And once there is a schema, it may as well be expressed explicitly, independently of the code, because then you get the whole basket of standard schema operations for free (validating the data against a schema, providing a schema to an external consumer when moving data around, talking/operating more generally on schemas to manipulate data, generating glue code or APIs programmatically).

Your description sounds like you are using your objects as schema references, which is fine, but if there is a 1:1 correspondence with the schema then you are already doing data-oriented programming, and if there isn't then you can't use third-party libraries that support schema-based operations. And losing those schema-based operations hasn't gained anything, because the data still has a schema; it just isn't well organised.

TL;DR: Data-oriented programming isn't essential. But if you plan on passing data around between systems, a schema should be mandatory, and if you pass data around within a system, a schema is recommended.

> Why would I write a separate schema when the object itself knows what it will accept, what is optional, and what range the values should be in?

In practice, I have seen a fair number of complex objects where that information is obscure. If there isn't an explicit schema, there is a chance of bugs where the object doesn't understand the data it is ingesting and won't share its knowledge that there is a problem until it fails in some obscure way at runtime. A lot of time is wasted fixing those bugs, because the easiest way to clean that up is to tease out an explicit schema and start thoroughly validating inputs.
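
A minimal sketch of "teasing out an explicit schema" where the schema is itself just data (hypothetical field names; a real project would more likely reach for something like JSON Schema):

    import java.util.Map;

    final class SchemaCheck {
        // The schema is plain data: field name -> expected type.
        static final Map<String, Class<?>> AUTHOR_SCHEMA =
            Map.of("firstName", String.class, "lastName", String.class, "books", Integer.class);

        // One generic validator works for any schema expressed this way.
        static void validate(Map<String, Object> data, Map<String, Class<?>> schema) {
            schema.forEach((field, type) -> {
                if (!type.isInstance(data.get(field))) {
                    throw new IllegalArgumentException(field + " must be a " + type.getSimpleName());
                }
            });
        }
    }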

deltaonefour · 3 years ago
> Code is itself data, so a full decoupling is logically impossible.

Nah, code exists in a different universe than "data" and is decoupled by default. Runtime data is completely unaware of "code", unless you do something called "reflection", which is a rarely used feature.

deltaonefour · 3 years ago
If you have a hard time thinking this way, then you really need to try other paradigms of coding, because OOP is not the only way, and it is getting less and less popular.

For example, C is not OOP. Linus Torvalds hates OOP, so Linux is written in C. Go was created by Robert Pike, who is also subtly against it, and it shows in the language. Additionally, Rust pretty much gets rid of objects as well. React is also moving away from class-based representation of components.

These are just modern languages that are moving away from OOP. In addition to this... behind the modern languages there's a whole universe and history of other styles of programming.

Not against OOP... But I'm saying it's not a good sign if OOP is the only perspective you're capable of seeing.

discreteevent · 3 years ago
I think all of your examples allow encapsulation of data behind polymorphic interfaces. So, not data oriented. This includes the Linux kernel [1]

https://www.cs.cmu.edu/%7Ealdrich/papers/objects-essay.pdf

g9yuayon · 3 years ago
Reading the discussion here, I can't help thinking that people are defending their own philosophies: OOP vs FP vs DOP vs etc. I wish the author had killer applications or killer examples in different categories: can I code an operating system more easily, can I code a database more easily, can I create a complex streaming job more easily, can I write a library as complex as Apache Beam more easily, can I write a compiler more easily, can I create a web framework more easily, can I write a JSON parser more easily? You get the idea. Or maybe examples that contrast existing solutions: how do I use DOP to write a better RxJava? How do I use DOP to write a better SQLite? How do I use DOP to write a better graph library? How do I use DOP to write a better tensor library? How do I use DOP to write a better Time/Date library? You know, something that's so compelling and so obvious.
ozim · 3 years ago
I have to agree here - there is a total disconnect from context in these discussions.

I am writing business line applications - I don't have much need for "generic" functions like those outlined in the article. My framework/language provides, for example, a generic .Sum() I could use if I implement a specific interface.

But usually I have to compute a specific sum and put it in the database or in the interface.

Like I need to sum ages, or sum prices, or sum the number of items in inventory - and I have to show these in the interface. I think it is quite BS to say there can be "generic" data structures and "generic" functions in the context of business line applications.

Other stuff I was doing was a warehouse automation system, and while I had X, Y, Z coordinates in a generic data structure named Coordinates, any function that did anything with coordinates had to be implemented in the context of a machine. For example, a lift should never operate on the X coordinate. I could calculate distances - but there was never a use case for calculating the distance between machines, because machines had static access points and one would only ever calculate distances to those access points.

throwaway894345 · 3 years ago
You'll have to define "OOP" first. Everyone thinks it's defined, but even among OOP proponents there isn't consensus ("it's about message passing", "it's about encapsulation", "it's about inheritance", "it's about dot-method syntax", etc.).
g9yuayon · 3 years ago
My point is that the author needs to make clear what exactly DOP is better for. OOP, however it is defined, is just an example of what people discussed under the OP.
deltaonefour · 3 years ago
One way to define it is to look at the thing that's unique to OOP that isn't used in any other paradigm.

In OOP, data and methods are fused together into primitives that form the basic building blocks of your program.

This is unique to OOP, so it must be the definition. Defining it in terms of message passing, encapsulation, or inheritance is not as good, because those concepts are used in other paradigms as well.
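
As a small illustrative contrast (hypothetical names, not from the thread), the same behaviour written with and without that union of data and methods:

    // OOP: data and method fused into a single building block.
    class Counter {
        private int value;
        void increment() { value++; }
    }

    // The alternative: the data is a plain value and the behaviour is a free function.
    record Count(int value) {}
    final class Counting {
        static Count increment(Count c) { return new Count(c.value() + 1); }
    }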

viebel · 3 years ago
DOP is a good fit for building information systems
osigurdson · 3 years ago
>> Take, for example, AuthorData, a class that represents an author entity made of three fields: firstName, lastName, and books. Suppose that you want to add a field called fullName with the full name of the author. If we fail to adhere to Principle #2, a new class AuthorDataWithFullName must be defined

Wait, what? Just add another field/property to the existing class. It's a silly example anyway, as normally you would just add a function that concatenates the two strings.
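
For example, a minimal sketch of that alternative, assuming the AuthorData fields from the quote:

    class AuthorData {
        String firstName;
        String lastName;

        // No new class and no extra stored field needed; derive the value on demand.
        String fullName() { return firstName + " " + lastName; }
    }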

The stated advantage is being able to add the new property "on the fly". I suppose this means without changing the code. It does beg the question: what can existing code possibly do with this (other than display it in a generic way or count the number of fields)? Furthermore, adding something new is rarely much of a problem, as it is a non-breaking change. A more difficult example would be removing the firstName field. Assessing the impact of such a change in a large code base would be extremely difficult. Get good at grep and hope that the test suite is comprehensive.

brunooliv · 3 years ago
Truly misguided article, as is most of the author's work, unfortunately. I was so excited about the book "Data-Oriented Programming" when it was first released... it was so heavily publicized that it was constantly in my face, which likely pushed me over the edge to give it a shot and buy it.

Unfortunately, not all that glitters is gold. The book feels extremely beginner-oriented, only touches on basic concepts taught at uni level, and shows a huge disconnect between the theory and the real-world work of a developer leveraging data in any way, shape or form.

Don't buy the book, it's so not worth it.

blain_the_train · 3 years ago
How did it show a disconnect?
brunooliv · 3 years ago
Essentially, by pushing the discussed topics as the One True Way and almost "purposely" choosing not to discuss trade-offs or talk about alternatives and disadvantages. In the real world, trade-offs are very important.
bo0O0od · 3 years ago
Awful stuff. These principles could only ever make sense in a dynamic language, since they mostly amount to manually enforcing some of the basic functionality of a type system, but the fact that he also tries to argue this style could be used in a language like C# throws that defense out the window.

https://blog.klipse.tech/databook/2022/06/22/generic-data-st...

The examples also contradict his other principles, i.e. immutability.

weavejester · 3 years ago
It's not impossible to type check heterogeneous maps at compile time, but most static type systems don't support this. I think you'd certainly see much more friction trying to program like this in C# than you would in Clojure.
jayd16 · 3 years ago
C# has type-safe anonymous types.

    //Anonymous type with integer property
    var foo = new {Number = 1};
    //Compile-time error if you try to assign the integer property to a string
    string s = foo.Number;
An object is just a fancy map, after all... These are also immutable by default, which probably makes them even more relevant to this DOP discussion.

zasdffaa · 3 years ago
> It's not impossible to type check heterogeneous maps at compile time, but most static type systems don't support this

I guess you mean dependent types [1], but if you don't, I'd appreciate an elaboration. If you do mean DTs, how might it look for a heterogeneous collection?

[1] If anybody has any good intros to dependent typing in C#, that'd be much appreciated. A web search throws up some pretty intimidating stuff.

bo0O0od · 3 years ago
I agree, but I also think that if the author either knew what they were talking about or wasn't just trying to sell more books, they'd make that clear, rather than just trying to make the case for these patterns in languages that don't suit them.

jayd16 · 3 years ago
Kinda funny to say, seeing as C# actually has a dynamic type.

Even if you don't use that, you could certainly orient your data as "structs of arrays" instead of "arrays of structs" (so to speak). It's fairly common in games.
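
A minimal sketch of that layout difference (in Java rather than C#, with illustrative names):

    // Array of structs: each element is a separate object, so the fields of
    // different particles end up scattered across the heap (a Particle[] is the AoS layout).
    class Particle {
        float x, y;
    }

    // Struct of arrays: each field is stored contiguously, which is friendlier
    // to the CPU cache when a loop touches only one field at a time.
    class Particles {
        float[] x = new float[10_000];
        float[] y = new float[10_000];
    }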

agrafix · 3 years ago
You can type-check this in static languages too, if the type system supports structural typing [0].

[0] https://en.wikipedia.org/wiki/Structural_type_system

mtVessel · 3 years ago
There's something called "Data-Oriented Programming" and something else called "Data-Oriented Design". I can never remember which is which. This post changes nothing.
GolDDranks · 3 years ago
I think the abbreviated forms of the paradigm names are often more "stable" than the full names, because people keep hand-waving the names and thus mixing them up. For me, "DOD" is the thing where you are very performance-oriented and have flat, cache-friendly arrays of data and an affinity to ECS (entity-component-system) stuff, etc. This is clearly not it, but eyeballing it, the ideas seem somewhat compatible with it. (Except the immutability part.)
ArrayBoundCheck · 3 years ago
One is ridiculous and uses immutable data. The other is made famous by a guy in a Hawaiian shirt and performs well. "Designer on vacation" might help you to remember which is which.
Tomis02 · 3 years ago
Indeed. The guy in the Hawaiian shirt writes simple, fast code with the minimum amount of abstraction, and can therefore afford to finish work early and go to the beach. Everyone else is working overtime untangling a mess of objects, principles and hierarchies.
osigurdson · 3 years ago
    static boolean isProlific (Map<String, Object> data) {
      return (int)data.get("books") > 100;
    }

Could anything be more confusing in a large code base? Also, you get lots of nice key-not-found and invalid-cast exceptions to debug with this approach. Sometimes the boxing makes a material difference to performance as well.
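
For contrast, a minimal sketch of the statically typed version this comment is implicitly arguing for, reusing the AuthorData fields quoted earlier in the thread:

    // The field name and type are checked at compile time, so there is no
    // missing-key lookup or cast to fail at runtime, and no boxing of the count.
    record AuthorData(String firstName, String lastName, int books) {}

    final class AuthorQueries {
        static boolean isProlific(AuthorData author) {
            return author.books() > 100;
        }
    }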