@faithisleaping @andrewt @danirabbit same person who shipped #embed but apparently our community applies the code of conduct selectively and not intersectionally. i don't think we're ready to "replace C" if we can't even replace dtolnay
@jk i'm using it and i love it actually
you can embed two files in a row by leaving the suffix directive off the first one and putting a comma! this rules!
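rough sketch of what i mean (file names are made up): each #embed expands to a comma-separated list of integer constants, so a bare comma between the two directives joins them into one array:

/* hypothetical file names; each #embed expands to a comma-separated
 * list of integers, so the lone comma stitches the two lists together */
static const unsigned char both[] = {
    #embed "header.bin"
    ,
    #embed "payload.bin"
};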
anyway. psa. when you use the new and exciting #embed directive for the first time to put a file in your favorite static const char array, remember to use 'suffix(, 0)' or your excitement will turn into twenty minutes of confusion
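concretely, something like this (message.txt is only a stand-in name): suffix(, 0) appends a trailing zero after the embedded bytes so the array is a usable null-terminated string:

#include <stdio.h>

/* "message.txt" is a stand-in name; suffix(, 0) appends a trailing 0
 * after the embedded bytes so the array works as a C string */
static const char message[] = {
    #embed "message.txt" suffix(, 0)
};

int main(void) {
    puts(message);                           /* safe: null-terminated */
    printf("%zu bytes\n", sizeof message);   /* file size plus the appended 0 */
    return 0;
}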
#define would extract the identifier & parse the following optional argument list & body, removing (escaped) newlines, to load into a "macro" table.
Non-preprocessor lines would then be scanned for these macros' identifiers to perform a find & replace, recursing to handle substitution of parameters.
2/3?
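a toy sketch of the scheme above (names and structure are mine, parameter lists and recursive expansion are left out):

#include <stdio.h>
#include <string.h>

/* "macro" table: #define lines are split into a name and a body,
 * then non-directive lines get a naive find & replace against it */
struct macro { char name[64]; char body[256]; };
static struct macro table[128];
static int n_macros;

static void define_macro(const char *line) {
    /* expects a line of the form "#define NAME BODY" */
    struct macro *m = &table[n_macros++];
    sscanf(line, "#define %63s %255[^\n]", m->name, m->body);
}

static void expand_line(const char *line, char *out, size_t cap) {
    /* whitespace-separated find & replace against the macro table */
    char buf[512];
    strncpy(buf, line, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    out[0] = '\0';
    for (char *tok = strtok(buf, " "); tok; tok = strtok(NULL, " ")) {
        const char *rep = tok;
        for (int i = 0; i < n_macros; i++)
            if (strcmp(tok, table[i].name) == 0) { rep = table[i].body; break; }
        strncat(out, rep, cap - strlen(out) - 1);
        strncat(out, " ", cap - strlen(out) - 1);
    }
}

int main(void) {
    char out[512];
    define_macro("#define GREETING hello world");
    expand_line("puts ( GREETING )", out, sizeof out);
    puts(out);   /* GREETING has been replaced with its body */
    return 0;
}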
@josh @sophiajt don't know how the language will be able to thread the needle of offering no form of metaprogramming, let alone a compiler plugin interface, while also remaining unable to allocate enough resources to the proposals that do undergo the RFC process to get them out of nightly.
even the linux kernel is setting RUSTC_BOOTSTRAP=1 for the allocator API, which i have personally implemented for both smallvec and indexmap and concluded that, due to limitations on cfg(...) bounds, it would require completely duplicating most of the code to still work on stable (so not upstreamed).
i then spent over a week recently on a proc macro which would quite literally just reproduce the entire content of an impl block in order to overcome the language's decision not to allow attributes on individual generic arguments in T<A, B, ...> the way you can on the individual value arguments of a function call.
but after realizing i'd missed several distinct categories of recursive AST modifications, i decided it was ridiculous to be reimplementing not just a parser but also a large enough subset of its typechecking semantics in order to make it safe enough for crates like smallvec to preserve the expectations around their unsafe calls. i could propose an expansion to the parser to allow attributes on type bounds, but parser changes are far more intrusive than adding a type to the standard library. in the meantime, c++ has not only successfully deprecated an element of their own std::allocator but completely removed it.
every other type of library in rust has to conform to some form of versioning, except unstable features in the stdlib. incredibly, c++ has solved this too, because it has a standards process, while the one "spec" rust has produced is strictly for their own compiler's output, and gccrs devs report that the type signatures of intrinsic methods they need to implement change over time (they said a T argument became a u32) in ways that essentially reduce to delegating to LLVM's behavior in practice.
and finally, when the incredible engineer who gave the world #embed spent their time developing a careful backwards-compatible proposal for metaprogramming for us https://mastodon.social/@tedmielczarek/114101345156483840, his rustconf keynote was instead cancelled for reasons no one will explain, which led him to avoid any further interaction with our community.
so there appears to be some strange force in the rust community which can stop anything from moving forward while remaining invisible and unaccountable, and i think this points to a severe governance issue that might be worth more scrutiny so rust can continue to become what it could be.
cmake has now produced a broken makefile which errors not during configuration but after it, the single thing i respected it for not doing, along with the caching, which i also successfully broke. it has been downgraded from "reliable with terrible language" to "it works if i don't touch it"
oh now i remember. i was using the amazing super cool #embed interface thephd achieved for all of us (FOR THE USER!!!!!!) instead of google's almost (not quite) impressively ass-backwards impl in cmake which is quite literally a copy-paste of the top web search result for how to embed a file in cmake
i think pants is the right way to approach this, and instead of getting overwhelmed by the right way to integrate spack for c/++ deps i will be able to focus entirely on making binaryen less obnoxious to build than with cmake
the specific reason i suddenly changed my mind here was that i saw possibly the most horrifyingly inefficient method of embedding data into a source file, which used cmake. and then i looked up #embed from jeanheyd and the documentation is great and the comments are going uwu at me on cppreference dot com. and it feels nice you know