Hacker News

While pleasant-looking, I'm not seeing any "close to the metal" features. C (with its extensions and standard library) has a stronghold in the embedded world because it supports:

* specified structure layout (e.g. bitfields)

* memory layout awareness (e.g. alignment & packing)

* memory ordering awareness (e.g. memory fencing)

* integration with processor intrinsics (e.g. SIMD)

Don't show me a "Hello, world!" example; show me a highly optimized lock-free single-writer single-reader queue. Show me code to decode/encode network protocols. Show me how to access MMIO.

As an embedded developer, I find C simultaneously not high-level enough and not low-level enough. What I would want to see in a language replacing C is at least:

* decoupling of data types from storage size from modular arithmetic (still allowing all to be specified)

* decoupling of logical structure layout from physical structure layout (allowing both to be specified)

* decoupling of on-the-wire layout from in-memory layout

* more expressive memory ordering/visibility constraints

* hygienic generics (Myrddin gets points for this, C++ does not)

* a proper module system (like OCaml's)

* more expressive means of hinting optimizations (such as when to make stack frames, spill registers, unroll loops, etc.)

That is, a new language needs to expand in both directions – higher- and lower-level – to replace C for "close to the metal" work. Just higher-level, like Myrddin and kin (OCaml, Rust, etc.), won't cut it.



Patches accepted -- The language is young, and I have certainly not added all the features that I want. For example, generics should be able to provide enough runtime information that I could do something like:

      pack = std.packbits(some_struct)
and get efficiently packed values.

Memory ordering, visibility, etc -- I'd love to have that added. I haven't figured out what exactly I want that to look like. Give me ideas, and I may very well implement them.

SIMD is just a matter of finding time, I think. I haven't put much work into exposing intrinsics yet because, let's face it, the generated code is slow right now, so optimization should probably start there. General usability (e.g., DWARF output, profiling, more sanity checks) is also a priority.

And I'd like to be self hosting before I do too much feature growth.


    char buf[2048]; /* if you hit this limit, shoot yourself */
Sorry. I stopped reading the code at that point.


Heh. The entire formatting code needs to be rewritten.

It works well enough for now, but it's a stopgap until I manage to get runtime types and pluggable formatters that can be used in places other than writing directly to an FD. This code is good enough for debugging and simple output to the user's command line, but it's extremely crude and limited. It also interacts poorly with type inference, since the compiler figures out many types for you, and it's hard to know what format specifier to put. Combine that with zero type checking on format args, and you get lots of corrupted output.

A limited buffer size is the least of that code's problems.

In short, it's certainly not final, and I'm certainly not satisfied with it as it stands. As for fixing it: Long term, I should be able to do

     std.put("% % %", "string", 123, 'c')
and get sane output, but it still needs compiler work before I get enough type information there. I should also be able to plug it into, e.g., a buffered I/O file, and have it write bytes to the stream and flush the file. I've punted on fixing that, though, until I add compiler support for runtime types.

(Syntax I have in mind: %{options}, where {options} is optional, and gets passed to the format plugin as a set of flags)


My concern isn't the hardcoded buffer size.

(If it's still not clear: I, and I suspect others, don't appreciate being insulted and threatened. Even in code comments. Even if it's meant "in jest".)


Heh. Compared to some things I've seen in a number of codebases I've poked at, it's positively tame and cheerful. I've learned a huge number of creative insults from code.

In any case, removed.


Appreciated, thanks.


In technical terms, I can understand temporarily using semi-broken code for the sake of bootstrapping.

In terms of the comment itself, I really wish people would avoid violent or otherwise inappropriate language in their projects. A good rule of thumb is to not say it if you wouldn't say it to the President's children while in public.


Agreed.

Even beside violent comments, avoid disrespectful comments in general. If you wouldn't say it in person to someone who's working on the project with you, don't say it in a comment.

(I'm sure there are coders who are fine with saying disrespectful things in person to collaborators. I choose not to work with those people.)



