gumby · 6 years ago
I'm glad this hasn't turned into (so far) the usual "c++ is dumb" flame fest.

I've really enjoyed programming in c++17 for the last three years (I had the luxury of starting from an empty buffer) and find the language pretty expressive. If you are able to use it as a "new" language, ignoring its C roots and features that exist pretty much just for back compatibility, it's really quite expressive, powerful, and orthogonal.

I'm actually pretty glad the c++ committee has been willing to acknowledge and deprecate mistakes (e.g. auto_ptr), to squeeze out special cases (e.g. comma in subscripting) and to attempt to maintain generality. Conservatism on things like the graphics standard is reducing the chance of the auto_ptr or std::map mistakes.

tsimionescu · 6 years ago
> it's really quite expressive, powerful, and orthogonal

Expressive and powerful are definitely true, but I'm not so sure about 'orthogonal'.

Most C++ features seem to depend heavily on each other, and library decisions can impose restrictions/boilerplate on application code in many areas.

Especially as the language has evolved, features like exceptions, destructors, copy and move constructors, and operator overloading have become more and more indispensable. Template functions are still required for any kind of variance (e.g. you still can't have a non-template function that works on a const std::shared_ptr<const *base_type> or any derived type).

None of these is unbearable, but I do believe that even the modern subset of C++ is still the least orthogonal mainstream language in use (and definitely the most complicated by far). To be fair, it's also likely the most expressive mainstream language too, and still the most performant.

mike_hock · 6 years ago
> you still can't have a non-template function that works on a const std::shared_ptr<const *base_type> or any derived type

I guess you meant, you can't have a function that accepts a shared_ptr to the base type without indirectly using template machinery?

Because you can most definitely have a function that does this that is not a function template.

    #include <memory>
    #include <iostream>

    struct foo {
        virtual ~foo() = default;
        virtual void bark() const { std::cout << "fork\n"; }
    };

    struct bar : foo {
        void bark() const override { std::cout << "bark\n"; }
    };

    void poke_so_it_barks(std::shared_ptr<foo const> const thing) {
        thing->bark();
    }

    int main() {
        poke_so_it_barks(std::make_shared<foo>());
        poke_so_it_barks(std::make_shared<bar>());
    }

beetwenty · 6 years ago
The thing that has really kept me from getting behind updates to the C++ universe is the lack of progress on improving the state of build tooling. It is miserably underengineered for modern, dependency-heavy environments. C++20 does introduce modules, which is a good push towards correcting the problem, but I'm still going to be "wait and see" on whether the actual implementation pans out.
gumby · 6 years ago
Well, there's Conan, which helps a bit, but these days what I simply do is use CMake to download a package from GitHub or wherever and build it.

Sadly the C++ ABIs aren't standardized the way that C ABIs are (I'm OK with why, but it's unfortunate in that it creates a barrier), so you have to have separate libraries compiled with g++ and clang++ if you use both on your platform (we use both because they catch different bugs, and for that matter exhibit different bugs). It also means you can't simply install, say, fmt in a system-wide directory like /usr/lib or /usr/local/lib.

Just as an amusing side note: Common Lisp used to be criticized for the massive size of its libraries and later likewise C++. It was true they were quite large. Now both are criticized for their tiny libraries. Which by today's standards they are.

umanwizard · 6 years ago
This is the #1 thing I love about C++ compared with Rust — I don’t want it to be easy to depend on thousands of things. I would rather use a small, curated, relatively standard set of libraries provided by my OS vendor or trusted third parties.

“Modern, dependency-heavy environments” are a symptom of the fact that certain ecosystems make it easy to get addicted to dependencies; I don’t think they’re a goal to strive towards.

davrosthedalek · 6 years ago
I'm not entirely convinced that this is a bad thing. The dependency-heavy environments a la Node.js gave us some interesting security nightmares.
de_watcher · 6 years ago
OS distributions have package managers and package repositories that have maintainers who are mostly decoupled from the developers. So that takes care of the quality/security problems that arise in ecosystems like Node.js.

There is also C. The tooling and the "package manager for C++" would be expected to seamlessly work for C and be expected to be accepted and used by the C community.

(personally I use CMake + OS package manager)

Fronzie · 6 years ago
Although I agree with your point, cmake+vcpkg goes a long way for hobby projects, and cmake with a bit of custom scripting goes a long way for larger-scale projects.

The cmake language might not be beautiful, but it does allow for simple sub-projects/library scripts once the main cmake script is set up properly.

geokon · 6 years ago
I think the general view is that in the C/C++ world, dependency management is the role of the build tool.

This is currently really easy to do cleanly through CMake by using Hunter and toolchain files (don't use submodules or ExternalProject_Add).

External tools like Conan are also unnecessary because they introduce redundancy and extra maintenance.

madhadron · 6 years ago
> build tooling...is miserably underengineered for modern, dependency-heavy environments.

My advice about build tooling is to use buck (from Facebook) or bazel (from Google). If you have an ex-Googler nearby, use Bazel. If you have an ex-Facebooker nearby, use buck. Otherwise flip a coin.

ronmex · 6 years ago
Look into Bazel, so far it has been a dream.
lonelappde · 6 years ago
What's the specific problem? Big companies run 100K+ .so/.o systems.
seisvelas · 6 years ago
What does orthogonal mean as an adjective for a programming language? I've never heard that word outside of math and I always get excited by a word that might expand my ability to think about code.
tomlu · 6 years ago
Every feature controls one "dimension" of a language, and can be freely combined with other features that control other "dimensions" without worrying about how they interact.

I am quoting the word "dimension", because applying a precisely defined mathematics concept to something messier (like programming) will require a certain amount of hand waving and squinting.

jejones3141 · 6 years ago
I first came across the notion of orthogonality for programming languages in Algol 68.

Perhaps it's easiest to give examples of what it's not. At first in C you could only assign scalars; to assign a struct or union you had to call memcpy() or something like it. Eventually you could assign aggregates, but you still can't assign arrays unless you hide them in structs. It took a while before you could pass or return an aggregate.

Exceptions of that sort are non-orthogonalities.

gugagore · 6 years ago
One place where this shows in a fairly technical way is in discussing instruction sets: https://en.wikipedia.org/wiki/Orthogonal_instruction_set

"In computer engineering, an orthogonal instruction set is an instruction set architecture where all instruction types can use all addressing modes. It is "orthogonal" in the sense that the instruction type and the addressing mode vary independently. An orthogonal instruction set does not impose a limitation that requires a certain instruction to use a specific register[1] so there is little overlapping of instruction functionality.[2]"

Its use here is more general, but I hope the specific example helps.

shanemhansen · 6 years ago
Honestly I think it's a silly word to use for features. In math orthogonal means roughly "at right angles" or "dot product equals zero". When two things are orthogonal it means that they are unrelated so you can deal with them independently which makes things simpler.

In describing features orthogonal is basically jargon for unrelated and adds no additional clarity imo.

paulddraper · 6 years ago
Orthogonal means that aspects are independent of other aspects.

I.e. you can understand one aspect of a programming language (memory allocation, generic programming, standard library, mutability, control flow, pattern matching, etc.) without understanding every other aspect.

As a counterexample, if a language has generics, but only generics for a standard array type, that lacks orthogonality.

lone_haxx0r · 6 years ago
Imagine a programming language that gives you three different ways to add 2 numbers:

1 + 2;

std::add(1, 2);

std::int(1).add(std::int(2));

That is not orthogonal at all, since we have 3 different ways to do the same thing for no discernible reason.

An orthogonal interface is one where there are fewer reasonable ways to do something, each function/operation has its role clearly defined and doesn't try to do things unrelated to its main role, and the interface doesn't mix different layers of abstraction (like intertwining OOP and arithmetic in my 3rd example, when there already are native arithmetic operations).

peheje · 6 years ago
I recommend the book: The pragmatic programmer. There's a section there explaining orthogonality by explaining a system that is not: a helicopter.

That example sticks with me.

sorokod · 6 years ago
Not the OP, but guessing that in this context it probably means independent of historical baggage.
fnord77 · 6 years ago
last time I used C++ was in the mid-2000s. With all the new features since then, I assume the "proper" way/style to write C++ has changed since then?
MauranKilom · 6 years ago
Largest changes (most from C++11, some from C++14 and C++17) are:

- std::unique_ptr and std::shared_ptr (your code should not have any new/delete). It's easy to write code with clearly defined ownership semantics, and I've yet to create a memory leak.

- Proper memory model so that multithreading has defined semantics.

- Lots of syntactic sugar (auto, range-for, lambdas, structured bindings, if constexpr, etc.).

- More metaprogramming tools/fun.

C++11 alone is a very different world compared to what you are probably used to.

gumby · 6 years ago
I don't know if you mean formatting, in which case I don't believe anything has changed (pretty much anything goes) but yes, you can now do for (auto [x, y] : list_of_coordinates()) ... and things like that.
gameswithgo · 6 years ago
* is now deprecated.
dotancohen · 6 years ago

> I had the luxury of starting from an empty buffer

Let's assume that I'm willing to flush the cache. Are there any C++17 learning resources that you might recommend? My day-to-day languages are Python and scripting languages.

zelly · 6 years ago
Cppreference (learn by doing and look stuff up as you go)

https://cppreference.com

The ISO C++ Standard draft (unironically—it's actually quite readable and is actually up-to-date unlike a lot of books)

https://wg21.link/std

Tour of C++ (book) by Bjarne Stroustrup (start here, you can read it in one weekend)

gumby · 6 years ago
For me, pls see my comment https://news.ycombinator.com/item?id=21456353

Also the books by the person whose blog spawned this comment thread seem to be pretty good.

dvdhnt · 6 years ago
I have seen comments that suggest you can write cross-platform applications (or at least shared business logic) in c++.

Is that the case? If so, how would you recommend someone go about learning c++?

davrosthedalek · 6 years ago
As in: recompile and works? For sure, see for example a lot of the HEP/NP software which runs typically on Linux, BSDs (incl. MacOS) and I think even Windows.

As in: have the same binary? You could probably use the same object files, but shared libraries and executables probably won't work (without trickery), just because the file format is different.

gumby · 6 years ago
I read Stroustrup's "tour of C++" then looked specifically for blogs that talked about C++ 17. And wrote a lot of code with all warnings enabled.

Yes you can write cross platform applications though perhaps only the engine, not UI.

DoofusOfDeath · 6 years ago
In commercial settings, I encounter several barriers to using C++ versions newer than 2011:

(1) Most C++ code I encounter is C++11, and I deal with lots of projects. It's rarely sensible to change the language version of a large code base without a very good reason.

(2) Many developers are well-familiar with C++11.

(3) There's no widespread perception that C++14 or later bring compelling language improvements.

(4) Some (most?) C++11 developers are already uneasy with the complexity of C++11, and aren't eager to incur the learning curve and (perhaps) new set of pitfalls associated with newer C++ versions.

(5) Some C++11 developers look at the language trajectory, especially in terms of complexity, and have decided that their next language(s) will be e.g. Rust or Julia instead of C++14, C++17, etc.

I suspect these factors all contribute to the momentum that C++11 seems to have.

jandrewrogers · 6 years ago
I don't encounter these barriers in commercial settings. The biggest barrier is compiler support if you target environments with different compiler toolchains. Each new version of C++ brings features and bug fixes that make code simpler. It actually reduces complexity from the perspective of the developer. The forcing function for upgrading C++ versions in my teams is always that it would greatly simplify some immediate problem or allow us to remove a gross (but necessary) hack due to the limitations of prior versions. At least with the LLVM toolchain, upgrades have been nearly issue-free so there is little to recommend not doing it. C++17 is a big upgrade over C++11 functionally. Developers that do not see compelling improvements probably weren't using most of the functionality of C++11 anyway.

While the number of features and quirks in standard C++ grows with each version, the subset that a developer needs to remember to write correct idiomatic code is shrinking with each new version as capabilities become more general and the expression of those capabilities simpler. There is a new, simpler, and vastly more capable language evolving out of the old one and you can increasingly live almost entirely within this new language as a developer. I hated classic C++ but I actually really like this new C++, and it allows me to write far less code for the same or better result.

There are still many things that C++ can express simply and efficiently that Rust, Julia, etc cannot. If you work on software that can take advantage of that expressiveness (e.g. databases in my case), the cost of changing languages is quite large.

dgellow · 6 years ago
> There is a new, simpler, and vastly more capable language evolving out of the old one and you can increasingly live almost entirely within this new language as a developer.

How is that true in practice? I would expect that once you have a codebase big enough, instead of living only with the new language you still have to deal with (or at least read) code written in previous versions of the language. And so instead of having to know only the modern subset you would have to keep in mind all the potential quirks from previous versions.

brylie · 6 years ago
How does a developer who is new to C++ learn just the modern idiomatic style? Is there a "C++, the good parts"?
tcbawo · 6 years ago
This has been my experience also. Of course you can't forget the old stuff if you're maintaining an old codebase. But, I'm grateful to use lambdas, std algorithm, if constexpr, and fold expressions instead of boilerplate classes, bespoke loops, SFINAE hackery, and recursive variadic functions. There is less wizardry necessary than in the past.
w-m · 6 years ago
On the other hand, there aren't really any changes in 14 and 17 that break compatibility with code written in 11, at least that I'm aware of.

In the examples I've seen it seems to mostly come down to compiler, toolchain and 3rd party binary availability for the new versions, like with any major language version.

If for example you are using stock gcc from Ubuntu 16.04 for your builds, you can't use C++17.

Fronzie · 6 years ago
For big (legacy) code-bases, just upgrading the compiler is a project by itself.

All 3rd party C++ libs need to be upgraded, all internal projects need to upgrade in an orderly fashion.

On top of that, just changing the compiler will expose some pre-existing bugs. The bugs may have been there all along, but they're not a problem for the business as long as they don't show.

I've seen projects with a core team of half a dozen people, supported by the other SW engineers working for 6 months just to upgrade to a newer version of windows and the compiler. This was mostly due to unwise SW-design decisions accumulated over 15+ years.

gumby · 6 years ago
> On the other hand, there aren't really any changes in 14 and 17 that break compatibility with code written in 11, at least that I'm aware of.

auto_ptr has thankfully been removed, but if you have legacy code that uses it, it won't compile under C++17. Of course, if you have code that uses auto_ptr you should fix it, but realistically nobody has the time for that.

CoolGuySteve · 6 years ago
ABI stability of the STL is the biggest issue I've seen. Upgrading from C++11 to C++17 becomes a bust if any of your 3rd party libraries want to pass even a basic std::string.

I don't know what the hold up is on modules. Even a basic "this header uses a C++11 namespace" functionality would fix most of the problems.

But even better would be something like "extern 'Cwrapper'" that would let you expose C++ objects as a C API with a void* object pointer without having to do the pointless wrapper dance every time.

Like make the compiler convert "class Thing { void doThing(int a); }" to "typedef void* Thing; void Thing_doThing(Thing self, int a);" automatically so that other languages can call me.

gpderetta · 6 years ago
What's the issue with ABI stability? GCC broke the library ABI with C++11, but the old ABI was still available. I think it's still available in C++14 and 17 mode.

If you switched to the new ABI with c++11 then c++17 as far as I know brings no new breakages.

Ididntdothis · 6 years ago
I don’t understand why they don’t do more about ABI. Compared to C# or java it’s such a pain to share code in C++. To me this would open up a lot of opportunities to create more libraries.
harry8 · 6 years ago
Sounds like a job for an llvm tool. The parse tree is exposed, doing the transform shouldn't be ridiculous? Are you keen enough?
htns · 6 years ago
Re (5), I rather feel that the quality of the documentation and communication around modern C++ puts everything else to shame. If you knew java or python in 2009, how are you supposed to quickly catch up with 2019? (Rust also seems to have piled on a lot of additions since 1.0, but I don't use it so I shouldn't speak.) There are books and blogs for everything popular, but it's really a little shocking to me that no other big language has anything that can compare to cppreference.com.

This doesn't change what people other than me might be thinking, of course.

dgellow · 6 years ago
I guess it's a question of habit. I've been learning C++ in my free time for a few months now. I find cppreference.com really difficult to parse. I understand way more of the documentation now than when I started, because of the bit of experience I gained, but it is written in a way that makes everything sound really over-complicated. I remember when I googled for something and arrived at the page about "Value category": https://en.cppreference.com/w/cpp/language/value_category. If you're not already well familiar with the language, this page is really hard to understand IMHO.

I found it way easier to deal with Go, C#, and Ruby documentation for example.

gumby · 6 years ago
I agree that cppreference is amazing and is my only c++ link in my browser favourites. Documentation is typically clear and succinct. And it's a wiki!

Not a fan of the spongers that scrape it though.

einpoklum · 6 years ago
(1) It's actually rather sensible, because C++ is famous for its backwards compatibility. So you can use C++14 or C++17 features in some new code while leaving the rest as-is - and gradually refactoring some of it with new features.

(2) This used to be the case with C++03 (or C++98), and it changed...

(3) I partially disagree and partially claim you're begging the question. You don't have poll results to suggest what people think of C++14. But C++14 had far fewer changes than C++11 or C++17; this was natural considering how the cycle of debate over potential features progressed.

The thing is, the fact that C++14 wasn't that revolutionary in terms of features also means it is easier to adopt. If you want more novelty - try C++17; or - try a C++14-dependent library like Eric Niebler's ranges-v3.

(4) C++17 allows for writing less complex code in many cases. Specifically, you can significantly reduce the need for template metaprogramming - a particularly hairy part of the language - by using more constexpr functions and if-constexpr in regular functions. And there are other examples. Like Bjarne says: The C++ committee endeavors to "make simple things simple (to write)". So it's a trade-off.

(5) Some people switch languages because of workplace decisions or personal preferences, that's true; but people also adopt C++ because of what becomes possible, or easily-expressible, with newer versions of the language. For example, using the ranges library and with C++14 or later, functional programming with C++ has become quite reasonable, borderline pleasant even.

---

Bottom line: In my opinion, at this point C++11 has more inertia than momentum. And that's ok. People will progress over time.

pacman128 · 6 years ago
(1) I know of one problem with C++17 and old code. C++17 removed throw specifications from the language (http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2016/p000...) . We use third party libraries that contain these which keep us from using C++17.
nestorD · 6 years ago
(4) is my main reason for missing C++17 when I am on a project stuck with C++11.

I feel that the template metaprogramming framework deserves an overhaul to clean up historical mistakes and keep only the good part.

microtherion · 6 years ago
> There's no widespread perception that C++14 or later bring compelling language improvements.

I would argue the opposite: C++11 is a fairly different paradigm to C++03, and upgrading an existing code base is a sizable project.

But once you get hooked into the concepts it introduces (auto, lambdas, templates, constexpr), each subsequent revision of the language standard introduces some improvement to these concepts that deals with unpleasant edge cases the earlier revision had, so each of them promises to make your existing use cases simpler, at a moderate rewrite expense.

CreRecombinase · 6 years ago
C++11 vs C++14/C++17/C++20 is not python2 vs python3. Because of the emphasis on backwards compatibility, you can often just recompile your C++11 project with the new version of the standard and everything should just work. Backwards compatibility also means that the newer versions are only as "complex" as the features you choose to use.
saghm · 6 years ago
The tricky part of this is that if your project is a library, you're not just opting into having to update your own toolchain, but also that of all your users. This means that a lot of library developers will err on the side of not updating their minimum version, which in turn means application developers don't have as much pressure to update either. This isn't specific to C++ by any means, but I think it does pop up a bit more in this space due to the fact that a lot of developers prefer to just use the default toolchain of their system, so forcing downstream users to update their toolchain could be viewed as a bit more of a maintenance burden.
jcranmer · 6 years ago
> (3) There's no widespread perception that C++14 or later bring compelling language improvements.

Constexpr if (introduced in C++17). If you're building a library that uses templated types (and therefore static dispatch instead of virtual functions and dynamic dispatch), then using constexpr if dramatically simplifies the implementation of your code.

saghm · 6 years ago
I don't think GP was arguing that there weren't any compelling language improvements, just that there isn't a "widespread perception" of those improvements.
paulddraper · 6 years ago
> here's no widespread perception that C++14 or later bring compelling language improvements.

C++14 didn't add a whole lot of new things; it mostly improved on existing features.

C++17 adds a number of new things.

I would use it just for std::optional and std::variant though.

dgellow · 6 years ago
Do you think that will change with the introduction of modules? Would that be a feature compelling enough to make people move to newer versions of the standard?
zelly · 6 years ago
They will switch to 20 for modules. The No. 1 headache is build times. It will literally save money.
SamReidHughes · 6 years ago
C++14’s compelling improvement is initialization in capture lists, so you don’t have to copy into them. I haven’t seen anything compelling beyond that.
lasagnaphil · 6 years ago
Right now my biggest things I like from C++20 are:

- Concepts (Now I can use static polymorphism without all those SFINAE template hacks)

- Designated Initializers (Finally!!! Using POD structs became way more convenient now)

- Coroutines (Would be pretty nice for writing games / interactive applications)

The things I don't have any interest in (but don't care if it's in or not)

- Ranges (Too much complexity for something you could do with plain old for/if loops...)

The things I'm worried about:

- Modules (Theoretically really good, but in practice the whole thing is starting to become a mess, especially when it's interacting with the preprocessor and external libraries, and when trying to preserve backward compatibility.)

jokoon · 6 years ago
I have hope for modules because they can improve build times. I'm not exactly sure they actually will, but if so, it would be a massive improvement.
saurik · 6 years ago
Meanwhile, I am deeply concerned about modules, as it seems like the kind of feature that is going to massively harm build times by making it much more difficult to do truly independent distributed builds (which right now is pretty easy with C/C++ due to its separate compilation and extremely explicit inter-file dependencies).
Renana · 6 years ago
Have you tried using IncrediBuild? It's a dedicated solution for reducing C++ build times. If you're working on Windows, you can use it for free (sharing the link to download): https://www.incredibuild.com/ibonlinestore/register#/regWind...
mangix · 6 years ago
OTOH ranges usually get compiled to faster and more efficient code. Smaller too.
jcelerier · 6 years ago
> OTOH ranges usually get compiled to faster and more efficient code. Smaller too.

than raw loops? Do you have an example?

ephaeton · 6 years ago
Designated Initializers done awkwardly a.k.a. "the C++ way"

  // Point2D point2d {.y = 2, .x = 1};         // (1) error
Oh my. The C++ committee keeps their flight altitude rivaling moles and worms. This is another brittle feature, just like the initializer lists that have to mirror the declaration order. In what world is this a better option than allowing any order? Can't compiler implementors be bothered or what is the rationale?? C does this better.

hohenheim · 6 years ago
I think you are confused about the intention. If you want order independent initialization you could just as well use the old initializer list. The idea here is that you DON'T want to accidentally initialize x with y.
rosshemsley · 6 years ago
Features I wish C++20 had:

* An opinionated, modern packaging and dependency story (like go modules, Rust crates)

* built-in library support for logging, http, zip, gzip, json, yaml, template rendering, RFC3339 datetime reading/writing

* the dream: compliant compilers must be able to compile down to static binaries, cross compiling built-in.

Features C++20 actually has:

* new fancy spaceship operator that I'll now have to learn and never use...

lone_haxx0r · 6 years ago
> built-in library support for logging, http, zip, gzip, json, yaml, template rendering, RFC3339 datetime reading/writing

That's completely out of scope for a programming language aspiring to be taken seriously as a general-purpose one (including e.g. systems programming).

Also, your list looks like a pretty arbitrary choice of protocols and formats; why don't we also add support for SSH, SMTP, IMAP, PNG? In the end, everyone would want their niche protocol/format to be included in the standard.

> the dream: compliant compilers must be able to compile down to static binaries, cross compiling built-in.

Cross-compiling to what architectures specifically? The most used ones are proprietary and I think that disqualifies them from being included in a serious international standard, since you'd need to read the spec and implement it in order to be compliant (I'm not versed in ISO inclusion requirements, so this may already be happening, but it would still be wrong; grepping "x86" in the official C standard returns no results though).

zelly · 6 years ago
> built-in library support for logging, http, zip, gzip, json, yaml, template rendering, RFC3339 datetime reading/writing

There is a Boost library for each of those things.

> An opinionated, modern packaging and dependency story (like go modules, Rust crates)

> the dream: compliant compilers must be able to compile down to static binaries, cross compiling built-in.

You can use Conan or vcpkg. LLVM basically solves the cross-compiling issue since you can take the IR to any platform that has a LLVM target.

Neither of these are feasible to include in the International Standard because C++ runs on more than amd64 and would make a lot of obscure platforms no longer compliant with the standard. Rust crates are nice, but people building medical devices with C++ shouldn't need to worry about supporting an npm-like monstrosity in their builds.

steveklabnik · 6 years ago
LLVM IR is not platform independent.

Rust compiles on far more platforms than amd64, and Cargo works just fine.

Nothing forces you to depend on any packages you don’t want to.

cjhanks · 6 years ago
C++ has many audiences with disparate needs; ranging from firmware, to scientific computing, to application development.

Most of the industries using the language optimize their code for a target platform. And general non-optimized portable binaries are simply not cost effective, so standardizing installation has not been simple.

If you do care about portability, you functionally need to:

- ensure you're not linking to anything but libc and the vDSO

- compile with no assembly language

- disable most optimizations, and select a sufficiently generic hardware target

- directly compile all 3rd party source into your binary

Then, your binary should last a long time.

That all said, I believe your needs are in the minority in the C++ community, and other languages likely better suit your needs.

nice_byte · 6 years ago
i never understood the need for package managers. just check your dependencies in and build with the rest of your source code. this doesn't introduce extra moving parts into the system (why should i need an internet connection to build my stuff?), easy to patch and debug dependencies, gives you full control and doesn't restrict your toolchain (e.g. conan doesn't as of now support vs2019+llvm). what i _would_ like to see though is a standardized build system for c++ projects, although that will likely never happen.
zelly · 6 years ago
This is what most companies do irl for security reasons. It's also the easiest, most obvious solution with the least moving parts.

There will never be a standard build system, but CMake is the de facto standard build generator and works well enough.

gumby · 6 years ago
One thing I appreciate is mdspan -- multidimensional projections onto a regular linear vector. I've always had to spin these myself, which means they don't fully participate as ordinary containers (who's going to go through all that work for their own application code?).

I'm hoping I can map them into GPU space -- looks like the standard has adequate control for this.

alexhutcheson · 6 years ago
Does Eigen not work for your use cases?
gumby · 6 years ago
That's a good call for a great library.

It's often overkill for what I do, especially as it can't use the same compiler options as my tree normally does (they have to bend over backwards for back compatibility; I don't). I do use it some places. I certainly don't envy the hard work of library developers!

When I simply need a 2D or 3D resizable array it's often easier, as I mentioned, to simply spin one up. Having that support "native" will be great.

zelly · 6 years ago
mdspan didn't make it in 20
gumby · 6 years ago
Rats!
afranchuk · 6 years ago
I've been anxiously awaiting modules and concepts. Both are incredibly important to reduce developer burden and improve the development experience.

Modules: While it is important to be able to fine-tune linking and compilation in some settings, in most -- and especially for beginners -- this should be handled by the compiler. Compared to other modern languages, C++'s toolchain is much harder to understand, with its dichotomy of compilation and linking. Header files were a hack that has been around for too long, and there needs to be less separation between the interface and the actual binary code when compiling. This is a headache both for novices and the experienced.

Concepts: The missing link for templates. Besides removing some of the roundabout-ism of SFINAE by providing compile-time verifiable interfaces, I think the biggest benefit of concepts will be the error messages. Right now, the state of errors in heavily templated code is abysmal, and this translates to a lot of wasted time for developers. Concepts should allow the errors to indicate exactly what's wrong.

I can't wait to be able to use these in public code. Some compilers support concepts with a flag already (and GCC 10 has it without the special flag), though none support modules yet...

dgellow · 6 years ago
Could someone describe the "likely" and "unlikely" attributes in a bit more detail? They seem like a very niche thing to add to the language. Is it expected that those hints will give the optimizer enough information to make an important difference?
johannes1234321 · 6 years ago
likely and unlikely have existed in different forms as compiler extensions for a while. There is code where they indeed help compilers to produce slightly better code for the hot path. Good use typically involves lots of measurement, as developers are often wrong about the impact and about what the real flow is.

What kind of things can the compiler improve? For instance it can arrange the code in a way that the instructions of the hot path are directly behind each other, whereas the unlikely case jumps further away. Or it can arrange code a bit to help the branch predictor.

usefulcat · 6 years ago
gcc and clang (at least, I don't know about msvc) have had these for a while. Basically, the instructions for whichever branch is declared most likely to be taken will be placed immediately after the comparison instruction(s), which is generally more cache-friendly.

It seems conceivable that it might also affect the compiler's inlining choices in each branch (i.e. don't bother to inline as aggressively in the less likely branch, to reduce code size), though I don't know for sure.

jnordwick · 6 years ago
Intel and AMD don't have any hinting instructions. Mostly it will affect inlining, and it will move unlikely code away so the likely path is straight.

But the compiler can often already figure that out, especially if it sees exceptions.

Better is to just use the profile-guided optimization in the compiler.

ludamad · 6 years ago
A common way to extend a language is to standardize what people are doing in practice. It seems niche, but you have to remember what C++'s niche is. Linux kernel etc use hints like this extensively
the_af · 6 years ago
I thought the Linux kernel was C and not C++. Has Linus changed his opinion of C++ in all these years?
bluGill · 6 years ago
In some niches it is important.

Say you are writing a high-speed trading program. At some point you get to "if (wouldMakeMoneyOnTrade()) ...". For every time this passes and you trade, there are 1000 times it fails and you don't. But time is critical, because your competitors are using similar algorithms and you need to beat them -- thus you mark the branch likely and ensure the CPU jumps to the trade code as fast as possible. Your competitors using Java pay a branch-misprediction penalty here, because the HotSpot optimizer has noticed that the if almost always fails and optimized for the false case. Thus you can make money by beating your competitor to the trade. (Note, this doesn't mean you can't use Java for your code, but if you do you need to manage that risk somehow.)

davrosthedalek · 6 years ago
If you don't use FPGAs, you probably lost that game anyway.
saagarjha · 6 years ago
A stop-the-world GC for HFT doesn't sound like it would work all that well…
dgellow · 6 years ago
Interesting. From the comment in abseil's implementation:

    // Compilers can use the information that a certain branch is not likely to be
    // taken (for instance, a CHECK failure) to optimize for the common case in
    // the absence of better information (ie. compiling gcc with `-fprofile-arcs`).
    //
    // Recommendation: Modern CPUs dynamically predict branch execution paths,
    // typically with accuracy greater than 97%. As a result, annotating every
    // branch in a codebase is likely counterproductive; however, annotating
    // specific branches that are both hot and consistently mispredicted is likely
    // to yield performance improvements.

zarkov99 · 6 years ago
Without actually looking at the docs, I would guess these map directly to CPU intrinsics that can be used to guide the CPU's branch prediction algorithms.
harry8 · 6 years ago
Which cpu instructions guide branch prediction algorithms? Which cpus?
beezle · 6 years ago
Been away from C++ a long time. If picking it up again, is it better to just start with C++17 or 20 or stick to C++11 which seems to be the defacto production version?
gumby · 6 years ago
Depends on your needs. If you don't have a lot of (or any) legacy code, yes, go straight to C++17 (or even 20, though you won't be able to use many of the features right out of the gate, or without third-party libraries). If you can do this you'll be glad you did.

I was able to start a project in c++17 in 2016, though at the time I had to use boost::variant with the Xcode compiler and boost::filesystem for all three compilers (only a limited use so I could even have skipped it). Also for some complex third party libraries I had to make a little trampoline file that compiled in C++14, or with special flags, and then call those entry points.

Since I was starting from a blank slate I also compiled everything with pretty much maximal warnings and -Werror, which really improved the code a lot -- something you can rarely get away with in legacy code (not to insult working legacy codebases!).

kllrnohj · 6 years ago
C++11 was the major version bump. It changed how you're supposed to write C++. C++14 & 17 are just minor fixes & improvements. So there's really no reason to not start with at least C++17.