Vadim Petrochenkov petrochenkov Moscow, Russia

petrochenkov/cargo 0

The Rust package manager

petrochenkov/ccache 0

ccache – a fast compiler cache

petrochenkov/cmake-rs 0

Rust build dependency for running cmake

petrochenkov/dynamorio 0

Dynamic Instrumentation Tool Platform

petrochenkov/highfive 0

Github hooks to provide an encouraging atmosphere for new contributors

petrochenkov/reference 0

The Rust Reference

petrochenkov/rfcs 0

RFCs for changes to Rust

petrochenkov/rls 0

Repository for the Rust Language Server (aka RLS)

petrochenkov/rust 0

a safe, concurrent, practical language

issue comment rust-lang/rust

Stabilize --pretty=expanded

I'd like to revive this issue. cbindgen is an important tool that must work on stable; if it requires pretty-printing, then so be it.

I think providing --emit expanded on a best-effort basis should be acceptable, if we document it. The pretty-printed output should at least parse successfully; the pretty-printer can insert parentheses to maintain parsing precedence for code expanded from macros, and if there are cases where it doesn't, we should treat them as bugs.

joshtriplett

comment created time in 3 hours

issue closed rust-lang/rust

--pretty=expanded does not include macro definitions

macro_rules macro definitions are not included in the pretty output; they seem to just disappear. I suspect this has something to do with macro_rules technically looking like a macro invocation despite being defined in the compiler, so the "expansion" of a macro_rules definition is to disappear. As you might imagine, macros that generate macros can be one of the more painful things to debug, so the No. 1 macro debugging tool not working in this scenario makes things extra hard :p

closed time in 3 hours

jgarvin

pull request comment rust-lang/rust

Fix recursive nonterminal expansion during pretty-print/reparse check

@bors r+

Aaron1011

comment created time in 3 hours

issue comment rust-lang/rust

Glob Time Travel

Sigh, the issue already affects three stable releases - from 1.44 to 1.46. I'll try to prioritize it.

vlad20012

comment created time in 4 hours

issue comment rust-lang/rust

extern_types associated functions aren't visible from other crates

Fixed in https://github.com/rust-lang/rust/pull/77375.

crumblingstatue

comment created time in 4 hours

PR opened rust-lang/rust

rustc_metadata: Do not forget to encode inherent impls for foreign types

So I tried to move the FFI interface for LLVM from rustc_codegen_llvm to rustc_llvm and immediately encountered this fascinating issue.

Fixes https://github.com/rust-lang/rust/issues/46665.

+25 -8

0 comment

3 changed files

pr created time in 4 hours

create branch petrochenkov/rust

branch : inherext

created branch time in 4 hours

pull request comment rust-lang/rust

Defer Apple SDKROOT detection to link time.

@bors r+

ehuss

comment created time in 5 hours

issue comment rust-lang/homu

Labels aren't applied when merge conflicts occur

If significant changes need to be done due to rebase, which is a rare case, then the reviewer can ask to rebase.

it seems like reviewers often wait to review until conflicts are resolved?

I don't think silently waiting for rebase is a behavior that should be encouraged in reviewers, since rebasing over small changes doesn't affect reviewing at all. If rebase is wanted, then the reviewer can ask for it and change the label.

camelid

comment created time in 7 hours

pull request comment rust-lang/rust

[experiment] Add MustClone and impl Copy+MustClone on Range{,From,Inclusive}.

One root regression due to #![deny(missing_copy_implementations)], no regressions due to Copy bounds. Other regressions are spurious.

eddyb

comment created time in 8 hours

issue comment rust-lang/rust

Add `-Zrandomize-layout` flag to better detect code that rely on unspecified behavior related to memory layout

Randomizing the layout has been suggested since the original struct layout RFC (https://github.com/rust-lang/rfcs/blob/master/text/0079-undefined-struct-layout.md#alternatives); it's unfortunate that it's still not implemented.

I've found one similar existing issue for this - https://github.com/rust-lang/rust/issues/38550.

marmeladema

comment created time in 11 hours

issue opened eqrion/cbindgen

"Unhandled const definition" on a constant with a type alias (typedef) type

type MyInt = i32;

const MY_CONST: MyInt = 10;

cbindgen will produce an "Unhandled const definition" error on MY_CONST because MyInt is not a "can_handle" type: https://github.com/eqrion/cbindgen/blob/1fc4cb072422e722f8bcb3af5766f84587a180c6/src/bindgen/ir/constant.rs#L393-L395 https://github.com/eqrion/cbindgen/blob/1fc4cb072422e722f8bcb3af5766f84587a180c6/src/bindgen/ir/constant.rs#L366-L374

The "can_handle" restriction looks artificial and unnecessary; if it is commented out, the correct constant definition is generated.

// C++
static const MyInt MY_CONST = 10;

// C
#define MY_CONST 10

In both cases it doesn't matter what MyInt actually is; cbindgen can just put it into the constant definition textually, and it will work in most cases as long as the right-hand side can be converted to C/C++.

created time in 11 hours

issue comment rust-lang/homu

Labels aren't applied when merge conflicts occur

See https://github.com/rust-lang/rust-central-station/pull/34 ("Do not change S-waiting-on-review to S-waiting-on-author on merge conflict").

camelid

comment created time in 18 hours

issue closed alexcrichton/jobserver-rs

unix: Check that file descriptors obtained from `--jobserver-auth=R,W` actually refer to a pipe

cargo and rustc (and other tools using jobserver) will fail with a "failed to acquire jobserver token: early EOF on jobserver pipe" error in the following scenario, reported in https://github.com/rust-lang/rust/issues/46981#issuecomment-681090839 as an ICE in rustc:

  • make executes a subprocess (e.g. shell) with the jobserver pipe closed (marked with FD_CLOEXEC), but the MAKEFLAGS env var will still contain --jobserver-auth=3,4 (or some other pair of integers) in the subprocess. This is where make is wrong: if it closes the descriptors for a subprocess, then it should probably clean up MAKEFLAGS for it as well. But it doesn't, so we have to live with make possibly providing garbage descriptors.
  • the subprocess (e.g. shell) opens some files so descriptors 3 and 4 are taken again now, but refer to entirely unrelated files.
  • the shell runs cargo, which sees that 3 and 4 are open, concludes that they refer to a jobserver, and fails when trying to read from them.

jobserver could be more resilient in the face of garbage descriptors if it checked not only that the descriptors are valid, but also that they actually refer to a pipe rather than random unrelated files.

closed time in a day

petrochenkov

issue comment alexcrichton/jobserver-rs

unix: Check that file descriptors obtained from `--jobserver-auth=R,W` actually refer to a pipe

Ok, I think it's better to be consistent with make and keep the current behavior, so I'll close this issue.

However, what needs to be done is a better diagnostic in the erroneous cases.

  • If cargo/rustc sees one of the MAKEFLAGS env vars, but the descriptors in it are closed, then it needs to emit a warning similar to make's "jobserver unavailable: using -j1. Add '+' to parent make rule.". Right now it will silently create a new jobserver leaving the user unaware about issues in their build system.
  • If cargo/rustc fails to acquire a token from something that it thinks is a jobserver, we need to check whether it's a pipe and report if it's not, again making the user aware of misconfiguration. Also, rustc shouldn't ICE on this, only produce an error - build system misconfiguration is certainly not an internal compiler error.

To do this the jobserver crate needs to produce slightly more detailed error types from from_env and acquire than it does now. I'll try to prototype these improvements for rustc and make a PR.

petrochenkov

comment created time in a day

issue closed rust-lang/rust

`test\debuginfo\pretty-std-collections-hash.rs` fails with Visual Studio 2017 toolchain

These two lines in src\etc\natvis\libstd.natvis confuse both cdb and the GUI debugger available inside VS:

<Item Name="{static_cast&lt;tuple&lt;$T1, $T2&gt;*&gt;(base.table.ctrl.pointer)[-(i + 1)].__0}">static_cast&lt;tuple&lt;$T1, $T2&gt;*&gt;(base.table.ctrl.pointer)[-(i + 1)].__1</Item>
<Item>static_cast&lt;$T1*&gt;(map.base.table.ctrl.pointer)[-(i + 1)]</Item>

cdb gives this error

Unable to find type 'tuple<u64,u64> *' for cast.

GUI debugger just skips the visualization and only shows raw representation.


It is possible to fix the visualization inside the VS GUI debugger by replacing static_cast with C-style casts:

<Item Name="{((tuple&lt;$T1, $T2&gt;*)base.table.ctrl.pointer)[-(i + 1)].__0}">((tuple&lt;$T1, $T2&gt;*)base.table.ctrl.pointer)[-(i + 1)].__1</Item>
<Item>(($T1*)map.base.table.ctrl.pointer)[-(i + 1)]</Item>

However, I haven't found a way to make cdb eat this yet.


Tests for other debuggers can use requirement directives like

// min-gdb-version: 8.2

If cdb from the VS2017 toolchain is unable to perform computations required for visualizing hash set/map, then we need to introduce such requirements for cdb as well.

cc @Amanieu @MaulingMonkey

closed time in a day

petrochenkov

issue comment rust-lang/rust

`test\debuginfo\pretty-std-collections-hash.rs` fails with Visual Studio 2017 toolchain

This was fixed in #76389 and #76390.

petrochenkov

comment created time in a day

pull request comment rust-lang/rust

[WIP] Token-based outer attributes handling

With this PR we perform transformations like cfg-expansion on both tokens and AST simultaneously, and then make sure that they are still in sync with the pretty-print-reparse hack.

In the end state, I think, we should treat the AST part as a cache (used for performance only) and always reparse it from tokens after any transformations (which will only be performed on tokens). The reparse hack will then be eliminated.

This is kinda opposite to what we do now by treating the tokens as a "cache" and regenerating them from AST when necessary (via pretty-printing), but unlike the "AST -> tokens", the "tokens -> AST" conversion is always lossless.

Having only tokens without any AST cache should be functionally correct, but it's pretty reasonable to predict that it will be a performance regression because we'll have to parse the same tokens at least twice and maybe more.

Aaron1011

comment created time in a day

pull request comment rust-lang/rust

Fix recursive nonterminal expansion during pretty-print/reparse check

Ok, marking as blocked then.

Aaron1011

comment created time in a day

pull request comment rust-lang/rust

Fix recursive nonterminal expansion during pretty-print/reparse check

I'm basically ready to merge this since no regressions are caused by issues in rustc. Alternatively, we could wait for crater results in https://github.com/rust-lang/rust/pull/77135 and land both changes together.

Aaron1011

comment created time in a day

pull request comment rust-lang/rust

Don't fire `const_item_mutation` lint on writes through a pointer

@bors r+

Aaron1011

comment created time in a day

issue comment rust-lang/rust

Segfault in _Backtrace_Unwind

@bossmc

gcc's? LLVM's?

I don't know, but it looks like they should be compatible. At least on Ubuntu both gcc and clang link to gcc's objects. The gcc ones can just be copied by rustbuild if the license allows; the LLVM ones can be used if we need to build them ourselves (they are part of compiler-rt).

bossmc

comment created time in a day

Pull request review comment rust-lang/rust

compiletest: Support ignoring tests requiring missing LLVM components

 trait Copy { }
 //x86_64: define win64cc void @has_efiapi
 //i686: define void @has_efiapi
+//aarch64: define void @has_efiapi
 //arm: define void @has_efiapi
+//riscv: define void @has_efiapi

Maybe. I simply reverted https://github.com/rust-lang/rust/pull/66084 here and didn't make any changes.

petrochenkov

comment created time in 2 days

PullRequestReviewEvent

pull request comment rust-lang/rust

Ensure that all LLVM components requested by tests are available on CI

@bors r=Mark-Simulacrum rollup

petrochenkov

comment created time in 2 days

push event petrochenkov/rust

Vadim Petrochenkov

commit sha 9340ee4380dcc9b81e0afb1ef5518730b064a78a

Ensure that all LLVM components requested by tests are available on CI

view details

push time in 2 days

pull request comment rust-lang/rust

Ensure that all LLVM components requested by tests are available on CI

thread 'main' panicked at 'missing LLVM component: my-nonexistent-component', src/tools/compiletest/src/header.rs:216:25

Excellent.

petrochenkov

comment created time in 2 days

push event petrochenkov/rust

David Wood

commit sha 01f65afa4adff6dfbea84621e6851c028aaa7159

diag: improve closure/generic parameter mismatch This commit improves the diagnostic when a type parameter is expected and a closure is found, noting that each closure has a distinct type and therefore could not always match the caller-chosen type of the parameter. Signed-off-by: David Wood <david@davidtw.co>

view details

Alexis Bourget

commit sha 8aae1eee94f481bd955cff473deae1f03c313451

Move cell exterior test into library unit tests

view details

Alexis Bourget

commit sha f69c5aa428efdbc01685c3d06e63fedd3120e8e5

Move more tests using Cell to unit tests

view details

Alexis Bourget

commit sha af44a2a857618150b180dabe9c3383a3911b3640

move 'cell does not clone' test

view details

Alexis Bourget

commit sha fc152cd67e0b6d3f11f49eae43183d03a3b8bf17

move 'test zip ...' tests

view details

Alexis Bourget

commit sha 85b2d9bf6f2b04ae8996050b2fb276bd58cd92de

fmt

view details

Alexis Bourget

commit sha 949c96660c32ff9b19a639b4be607938c2262653

move format! interface tests

view details

Alexis Bourget

commit sha ed52c7bb7516f12f74704e20457d5046378a49fc

Move deref-lval test

view details

Alexis Bourget

commit sha ac39debeba2b63a39a3833e2d7451f0b1f95b5f2

Move panic safety traits tests

view details

Alexis Bourget

commit sha 8904921c1d6b3636f4352f9dd6d4875132b89998

Move array cycle test

view details

Alexis Bourget

commit sha 275eed7eb1d45e8173b932e2abfdae2201d2cf62

Move vec-slice-drop test

view details

Alexis Bourget

commit sha 6bc0357dadd9d41a8166d4c2ab8a27c0bb8150d3

Move vec-cycle test

view details

Alexis Bourget

commit sha f6a4189d05f8bc7091450289f7285819ebdd3c62

Move vec-cycle-wrapped test

view details

Alexis Bourget

commit sha 5be843fc54f80817c88438efa097a4ba81d4aa9e

Move format-ref-cell test

view details

Ivan Tham

commit sha 1994cee61a4aea9dc46bb3d0323c8290621eda33

Add alias for iterator fold fold is known in python and javascript as reduce, not sure about inject but it was written in doc there.

view details

Ivan Tham

commit sha ea0065ad4f96153539476e2f3df83bae96018ede

Reposition iterator doc alias reduce before inline

view details

Alexis Bourget

commit sha a61b9638bbbb48f9c2fde0ccbbcf03e64494ea0f

review: fix nits and move panic safety tests to the correct place

view details

Dylan MacKenzie

commit sha bb6c249f99c736b6986232c0c2eeec1d058585af

Speed up `IntRange::from_pat` Previously, this method called the more general `pat_constructor` function, which can return other pattern variants besides `IntRange`. Then it throws away any non-`IntRange` variants. Specialize it so work is only done when it could result in an `IntRange`.

view details

Dylan MacKenzie

commit sha c4d8089f001cf6323a1bb8fac08b7f31de33171b

Revert "Add an unused field of type `Option<DefId>` to `ParamEnv` struct." This reverts commit ab83d372ed5b1799d418afe83c468e4c5973cc34.

view details

bjorn3

commit sha 71bc62b9f696ae83ef1713bd96054c92cda9f27f

Add option to pass a custom codegen backend from a driver

view details

push time in 2 days

pull request comment rust-lang/rust

Add associated constant `BITS` to all integer types

Ah, here's what happened:

FIXME(#11621): Should be deprecated once CTFE is implemented in favour of calling the mem::size_of function.

Anyway, I also agree that they should be usize because they are almost always used in a usize context.

m-ou-se

comment created time in 2 days

pull request comment rust-lang/rust

Add associated constant `BITS` to all integer types

Wait a second, what happened? We had the BITS constants and they had the type usize - https://github.com/rust-lang/rust/pull/23832.

m-ou-se

comment created time in 2 days

pull request comment rust-lang/rust

expand: Stop normalizing `NtIdent`s before passing them to built-in macros

@bors r=varkor

petrochenkov

comment created time in 2 days

push event petrochenkov/rust

Raoul Strackx

commit sha a13239dac2b4b50e389a3e12c638d164e620ea2f

generic ret hardening test

view details

Raoul Strackx

commit sha cd31f40b6f59c8966a09e7f1e2d6c3c5ed1570f1

generic load hardening test

view details

Raoul Strackx

commit sha bca8e07ef4048a7ca912d157156268f958369ef2

rust inline assembly lvi hardening test

view details

Raoul Strackx

commit sha 4d1d0c6bd7e9b5dda47691729fe15a49091d2e4d

skeleton check module level assembly

view details

Raoul Strackx

commit sha 947d7238e0db03146d33d6b3354231d7e3384735

Adding checks for assembly files in libunwind

view details

Raoul Strackx

commit sha 6db05904f6e64714537b37a6ca82010aa6cb46d0

LVI test std lib

view details

Raoul Strackx

commit sha 0526e750cd4e88fbb24ea92e809dfb6eefbf8fa0

started using cc crate

view details

Raoul Strackx

commit sha bdf81f508d0b79751063b93f4d151ec6a90d2a09

test hardening C inline assembly code (cc crate)

view details

Raoul Strackx

commit sha 64811ed5a590ef4c89c09f4d04d3cea11251da52

testing c++ code (cc crate)

view details

Raoul Strackx

commit sha d8a7904e06e77baf137b6713f9bf79f74ae6edfe

LVI hardening tests for cmake

view details

Raoul Strackx

commit sha 72a8e6b1931cac7e1c716688501c324cb77c9d09

Adding checks for module level assembly

view details

Raoul Strackx

commit sha 8ca26cca2919977cba79e7436c4f72fb6661ea9b

Building libunwind with new CMakeLists. The old CMakeLists file of libunwind used the C compiler to compile assembly files. This caused such code not to be hardened.

view details

Raoul Strackx

commit sha 7d3c3fdc1d57d555c726f1caa444e9dd5a02e142

cleaning up code

view details

Raoul Strackx

commit sha 159d11fb069fca88056bc1b8194d520489e3e921

Patch compilation test helpers for sgx platform

view details

Dylan MacKenzie

commit sha 61d86fa06ca4d7c93109fc857300abbd25a19a0a

Check for missing const-stability attributes in `stability` This used to happen as a side-effect of `is_min_const_fn`, which was subtle.

view details

Dylan MacKenzie

commit sha 11bfc60a4b5fa111527a69e5f511cb69ae5325af

Change error in `fn_queries` to `delay_span_bug` This should be caught by the new check in `rustc_passes`. At some point, this function will be removed entirely.

view details

Dylan MacKenzie

commit sha 76c6f2dc3f0769653e80c58bd3f288594fa11dc6

No need to call `is_min_const_fn` for side-effects

view details

Dylan MacKenzie

commit sha 6ce178f60eec86cfd9245e6289598938df519359

Test for missing const-stability attributes

view details

Oliver Scherer

commit sha 455f284496976c5a77e7f1cbdf1f382dc0a6d245

Deduplicate and generalize some (de/)serializer impls

view details

Tomasz Miąsko

commit sha 9b5835ec7973a52f2e9d6f4ed9a2181bebfdc399

liveness: Remove redundant debug logging The IrMaps::add_variable already contains debug logging. Don't duplicate it.

view details

push time in 2 days

pull request comment rust-lang/rust

expand: Stop normalizing `NtIdent`s before passing them to built-in macros

Yep, this PR is to blame.

petrochenkov

comment created time in 2 days

Pull request review comment rust-lang/rust

Ensure that all LLVM components requested by tests are available on CI

 docker \
   --env TOOLSTATE_REPO \
   --env TOOLSTATE_PUBLISH \
   --env CI_JOB_NAME="${CI_JOB_NAME-$IMAGE}" \
+  --env COMPILETEST_NEEDS_ALL_LLVM_COMPONENTS \

I misinterpreted what this command does. Moved the variable setting to src\ci\run.sh.

petrochenkov

comment created time in 2 days

PullRequestReviewEvent

push event petrochenkov/rust

Raoul Strackx

commit sha a13239dac2b4b50e389a3e12c638d164e620ea2f

generic ret hardening test

view details

Raoul Strackx

commit sha cd31f40b6f59c8966a09e7f1e2d6c3c5ed1570f1

generic load hardening test

view details

Raoul Strackx

commit sha bca8e07ef4048a7ca912d157156268f958369ef2

rust inline assembly lvi hardening test

view details

Raoul Strackx

commit sha 4d1d0c6bd7e9b5dda47691729fe15a49091d2e4d

skeleton check module level assembly

view details

Raoul Strackx

commit sha 947d7238e0db03146d33d6b3354231d7e3384735

Adding checks for assembly files in libunwind

view details

Raoul Strackx

commit sha 6db05904f6e64714537b37a6ca82010aa6cb46d0

LVI test std lib

view details

Raoul Strackx

commit sha 0526e750cd4e88fbb24ea92e809dfb6eefbf8fa0

started using cc crate

view details

Raoul Strackx

commit sha bdf81f508d0b79751063b93f4d151ec6a90d2a09

test hardening C inline assembly code (cc crate)

view details

Raoul Strackx

commit sha 64811ed5a590ef4c89c09f4d04d3cea11251da52

testing c++ code (cc crate)

view details

Raoul Strackx

commit sha d8a7904e06e77baf137b6713f9bf79f74ae6edfe

LVI hardening tests for cmake

view details

Raoul Strackx

commit sha 72a8e6b1931cac7e1c716688501c324cb77c9d09

Adding checks for module level assembly

view details

Raoul Strackx

commit sha 8ca26cca2919977cba79e7436c4f72fb6661ea9b

Building libunwind with new CMakeLists. The old CMakeLists file of libunwind used the C compiler to compile assembly files. This caused such code not to be hardened.

view details

Raoul Strackx

commit sha 7d3c3fdc1d57d555c726f1caa444e9dd5a02e142

cleaning up code

view details

Raoul Strackx

commit sha 159d11fb069fca88056bc1b8194d520489e3e921

Patch compilation test helpers for sgx platform

view details

Oliver Scherer

commit sha 455f284496976c5a77e7f1cbdf1f382dc0a6d245

Deduplicate and generalize some (de/)serializer impls

view details

Tomasz Miąsko

commit sha 9b5835ec7973a52f2e9d6f4ed9a2181bebfdc399

liveness: Remove redundant debug logging The IrMaps::add_variable already contains debug logging. Don't duplicate it.

view details

Tomasz Miąsko

commit sha 2fb1564457b47bd31087e2aba1b8eb6f15c000ef

liveness: Remove redundant fields for a number of live nodes and variables

view details

Tomasz Miąsko

commit sha 70f150b51e7d13e3bcd8977ff124a348057cf7ef

liveness: Delay conversion from a symbol to a string until linting

view details

Tomasz Miąsko

commit sha 141b91da6cb756ff5f36eebe9eee65922e295876

liveness: Inline contents of specials struct

view details

Tomasz Miąsko

commit sha 49d1ce00f30a10e607dc30506d3e890d1efb6309

liveness: Remove unnecessary local variable exit_ln

view details

push time in 2 days

pull request comment rust-lang/rust

compiletest: Support ignoring tests requiring missing LLVM components

@Aaron1011

It might be a good idea to add a flag to assert that all needed components are available, to ensure that we don't accidentally stop running these tests entirely.

Addressed in https://github.com/rust-lang/rust/pull/77280.

petrochenkov

comment created time in 3 days

pull request comment rust-lang/rust

Ensure that all LLVM components requested by tests are available on CI

(I've added an intentionally "mistyped" component my-nonexistent-component to test the change on CI; I will remove it once this is approved.)

petrochenkov

comment created time in 3 days

PR opened rust-lang/rust

Ensure that all LLVM components requested by tests are available on CI

Addresses https://github.com/rust-lang/rust/pull/75064#issuecomment-667722652

I used an environment variable because passing a command-line option all the way from CI to compiletest would be too much hassle for this task. I added a new variable, but any of the already existing ones defined by CI could be used instead.

r? @Mark-Simulacrum

+7 -3

0 comment

3 changed files

pr created time in 3 days

create branch petrochenkov/rust

branch : llvmcomp

created branch time in 3 days

pull request comment rust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

this makes it easy to modify the captured stream at an arbitrary depth, simply by replacing tokens in the flattened buffer. This is important for PR #76130, which requires us to track information about attributes along with tokens.

Tokens that need to be replaced in #76130 should all be at the same level, right? So, the replacement is not fundamentally impossible, just needs some tree walking to get to the right level?

Aaron1011

comment created time in 3 days

PR opened rust-lang/rust

expand: Stop normalizing `NtIdent`s before passing them to built-in macros

Built-in macros should be able to deal with NtIdents in the input by themselves like any other parser code.

You can't imagine how bad mutable AST visitors are, especially if they are modifying tokens. This is one step towards removing token visiting from the visitor infrastructure (https://github.com/rust-lang/rust/pull/77271 also works in this direction).

+12 -35

0 comment

2 changed files

pr created time in 3 days

pull request comment rust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

Hmm, why didn't this show up in https://github.com/rust-lang/rust/pull/76130?

Aaron1011

comment created time in 3 days

create branch petrochenkov/rust

branch : interpid

created branch time in 3 days

pull request comment rust-lang/rust

[experiment] Expand `NtExpr` tokens only in key-value attributes

@craterbot check

petrochenkov

comment created time in 3 days

issue comment rust-lang/rust

ICE in macro: doc meta with expr on an item, string concat, stringify!(...)

I started working on this in https://github.com/rust-lang/rust/pull/77271.

u32i64

comment created time in 3 days

pull request comment rust-lang/rust

Allow path as value in name-value attribute

I'll try to prioritize https://github.com/rust-lang/rust/issues/55414#issuecomment-554005412 and implement it myself sooner.

I started working on this in https://github.com/rust-lang/rust/pull/77271.

dtolnay

comment created time in 3 days

pull request comment rust-lang/rust

[experiment] Expand `NtExpr` tokens only in key-value attributes

@bors try

petrochenkov

comment created time in 3 days

PR opened rust-lang/rust

[experiment] Expand `NtExpr` tokens only in key-value attributes

Implement the experiment described in https://github.com/rust-lang/rust/issues/55414#issuecomment-554005412

+121 -37

0 comment

6 changed files

pr created time in 3 days

create branch petrochenkov/rust

branch : notokenexp

created branch time in 3 days

pull request comment rust-lang/rust

Unconditionally capture tokens for attributes.

Blocked on https://github.com/rust-lang/rust/pull/77250.

Aaron1011

comment created time in 4 days

Pull request review comment rust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

 impl<'a> Parser<'a> {
             token_cursor: TokenCursor {
                 frame: TokenCursorFrame::new(DelimSpan::dummy(), token::NoDelim, &tokens),
                 stack: Vec::new(),
-                cur_token: None,
-                collecting: None,
+                // Will get overwritten when we bump the parser below
+                collecting_buf: vec![(Token::new(token::Eof, DUMMY_SP), Spacing::Alone)],
                collecting_buf: vec![(Token::dummy(), Spacing::Alone)],
Aaron1011

comment created time in 4 days

Pull request review comment rust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

 impl<'a> Parser<'a> {
         &mut self,
         f: impl FnOnce(&mut Self) -> PResult<'a, R>,
     ) -> PResult<'a, (R, TokenStream)> {
-        // Record all tokens we parse when parsing this item.
-        let tokens: Vec<TreeAndSpacing> = self.token_cursor.cur_token.clone().into_iter().collect();
-        debug!("collect_tokens: starting with {:?}", tokens);
-
-        // We need special handling for the case where `collect_tokens` is called
-        // on an opening delimeter (e.g. '('). At this point, we have already pushed
-        // a new frame - however, we want to record the original `TokenTree::Delimited`,
-        // for consistency with the case where we start recording one token earlier.
-        // See `TokenCursor::next` to see how `cur_token` is set up.
-        let prev_depth =
-            if matches!(self.token_cursor.cur_token, Some((TokenTree::Delimited(..), _))) {
-                if self.token_cursor.stack.is_empty() {
-                    // There is nothing below us in the stack that
-                    // the function could consume, so the only thing it can legally
-                    // capture is the entire contents of the current frame.
-                    return Ok((f(self)?, TokenStream::new(tokens)));
-                }
-                // We have already recorded the full `TokenTree::Delimited` when we created
-                // our `tokens` vector at the start of this function. We are now inside
-                // a new frame corresponding to the `TokenTree::Delimited` we already recoreded.
-                // We don't want to record any of the tokens inside this frame, since they
-                // will be duplicates of the tokens nested inside the `TokenTree::Delimited`.
-                // Therefore, we set our recording depth to the *previous* frame. This allows
-                // us to record a sequence like: `(foo).bar()`: the `(foo)` will be recored
-                // as our initial `cur_token`, while the `.bar()` will be recored after we
-                // pop the `(foo)` frame.
-                self.token_cursor.stack.len() - 1
-            } else {
-                self.token_cursor.stack.len()
-            };
-        let prev_collecting =
-            self.token_cursor.collecting.replace(Collecting { buf: tokens, depth: prev_depth });
+        let start_pos = self.token_cursor.collecting_buf.len() - 1;
+        let prev_collecting = std::mem::replace(&mut self.token_cursor.is_collecting, true);

         let ret = f(self);

-        let mut collected_tokens = if let Some(collecting) = self.token_cursor.collecting.take() {
-            collecting.buf
+        let err_stream = if ret.is_err() {

Why can't you do let ret = f(self)?; and pass the error further, like it was done previously?

If it's necessary to update collecting_buf, then why can't we set err_stream to None in the error case and return some non-empty stream? It will be dropped anyway.

Aaron1011

comment created time in 4 days

Pull request review comment rust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

 impl TokenCursor {
                 self.frame = frame;
                 continue;
             } else {
-                return Token::new(token::Eof, DUMMY_SP);
+                (TokenTree::Token(Token::new(token::Eof, DUMMY_SP)), Spacing::Alone)
             };

-            // Don't set an open delimiter as our current token - we want
-            // to leave it as the full `TokenTree::Delimited` from the previous
-            // iteration of this loop
-            if !matches!(tree.0, TokenTree::Token(Token { kind: TokenKind::OpenDelim(_), .. })) {
-                self.cur_token = Some(tree.clone());
-            }
-
-            if let Some(collecting) = &mut self.collecting {
-                if collecting.depth == self.stack.len() {
-                    debug!(
-                        "TokenCursor::next():  collected {:?} at depth {:?}",
-                        tree,
-                        self.stack.len()
-                    );
-                    collecting.buf.push(tree.clone())
+            match tree.0.clone() {
            match &tree.0 {

Some premature cloning.

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

 pub use path::PathStyle;

 use rustc_ast::ptr::P;
 use rustc_ast::token::{self, DelimToken, Token, TokenKind};
-use rustc_ast::tokenstream::{self, DelimSpan, TokenStream, TokenTree, TreeAndSpacing};
+use rustc_ast::tokenstream::{self, DelimSpan, Spacing, TokenStream, TokenTree, TreeAndSpacing};
 use rustc_ast::DUMMY_NODE_ID;
 use rustc_ast::{self as ast, AttrStyle, AttrVec, Const, CrateSugar, Extern, Unsafe};
 use rustc_ast::{Async, MacArgs, MacDelimiter, Mutability, StrLit, Visibility, VisibilityKind};
 use rustc_ast_pretty::pprust;
-use rustc_errors::{struct_span_err, Applicability, DiagnosticBuilder, FatalError, PResult};
+use rustc_errors::{
+    struct_span_err, Applicability, DiagnosticBuilder, FatalError, Handler, PResult,
+};

Nit: could you avoid multi-line imports?

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Rewrite `collect_tokens` implementations to use a flattened buffer

 pub fn emit_unclosed_delims(unclosed_delims: &mut Vec<UnmatchedBrace>, sess: &Pa
         }
     }
 }
+
+/// Converts a flattened iterator of tokens (including open and close delimiter tokens)
+/// into a `TokenStream`, creating a `TokenTree::Delimited` for each matching pair
+/// of open and close delims.
+fn make_token_stream(
+    handler: &Handler,
+    tokens: impl Iterator<Item = (Token, Spacing)>,
+) -> TokenStream {
+    #[derive(Debug)]
+    struct FrameData {
+        open: Span,
+        inner: Vec<(TokenTree, Spacing)>,
+    }
+    let mut stack = vec![FrameData { open: DUMMY_SP, inner: vec![] }];
+    for tree in tokens {
    for (token, spacing) in tokens {
Aaron1011

comment created time in 4 days
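The stack-based rebuilding that `make_token_stream` performs can be sketched in isolation: a flat sequence with explicit open/close delimiters is folded back into a nested tree by pushing a frame on each open delimiter and popping it on the matching close. Simplified types below (hypothetical `Tree`, characters for tokens); not rustc code.

```rust
#[derive(Debug, PartialEq)]
enum Tree {
    Token(char),
    Delimited(Vec<Tree>),
}

fn make_tree(tokens: &str) -> Vec<Tree> {
    // One frame per currently-open delimiter; the bottom frame is the root.
    struct Frame {
        inner: Vec<Tree>,
    }
    let mut stack = vec![Frame { inner: vec![] }];
    for c in tokens.chars() {
        match c {
            '(' => stack.push(Frame { inner: vec![] }),
            ')' => {
                // Close the innermost frame and attach it to its parent.
                let frame = stack.pop().expect("unmatched close delimiter");
                stack
                    .last_mut()
                    .expect("unmatched close delimiter")
                    .inner
                    .push(Tree::Delimited(frame.inner));
            }
            c => stack.last_mut().unwrap().inner.push(Tree::Token(c)),
        }
    }
    assert_eq!(stack.len(), 1, "unmatched open delimiter");
    stack.pop().unwrap().inner
}
```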


pull request commentrust-lang/rust

might_permit_raw_init: also check aggregate fields

Is it okay to just update the cargotest to a newer version?

No idea. Maybe someone from cargo or infra teams knows.

RalfJung

comment created time in 4 days

pull request commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

@craterbot check

Aaron1011

comment created time in 4 days

pull request commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

LGTM, needs a crater run.

@bors try

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub trait PrintState<'a>: std::ops::Deref<Target = pp::Printer> + std::ops::Dere      fn to_string(&self, f: impl FnOnce(&mut State<'_>)) -> String {         let mut printer = State::new();+        printer.insert_extra_parens = self.insert_extra_parens();
        let printer = State { insert_extra_parens: self.insert_extra_parens(), .. State::new() };
Aaron1011

comment created time in 4 days
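The one-line suggestion above uses Rust's functional update syntax: override a single field and take the rest from a base expression. A self-contained sketch with a hypothetical `State` (only loosely modeled on the printer's):

```rust
#[derive(Debug, PartialEq)]
struct State {
    insert_extra_parens: bool,
    indent: usize,
}

impl State {
    fn new() -> State {
        State { insert_extra_parens: true, indent: 4 }
    }
}

fn printer_without_parens() -> State {
    // `..State::new()` fills in every field not listed explicitly,
    // avoiding the mutate-after-construction two-step.
    State { insert_extra_parens: false, ..State::new() }
}
```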


pull request commentrust-lang/rust

might_permit_raw_init: also check aggregate fields

Something in cargotest failed with "thread panicked while panicking. aborting.".

RalfJung

comment created time in 4 days

pull request commentrust-lang/rust

Unconditionally capture tokens for attributes.

r? @petrochenkov

Aaron1011

comment created time in 4 days

pull request commentrust-lang/rust

Test more attributes in test issue-75930-derive-cfg.rs

r? @petrochenkov

@bors r+

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

[WIP] Token-based outer attributes handling

 impl DelimSpan {
         self.open.with_hi(self.close.hi())
     }
 }
+
+#[derive(Clone, Debug, Default, Encodable, Decodable)]
+pub struct PreexpTokenStream(pub Lrc<Vec<(PreexpTokenTree, Spacing)>>);
+
+#[derive(Clone, Debug, Encodable, Decodable)]
+pub enum PreexpTokenTree {
+    Token(Token),
+    Delimited(DelimSpan, DelimToken, PreexpTokenStream),
+    OuterAttributes(AttributesData),

Inlining AttributeData will improve readability here, IMO.

Aaron1011

comment created time in 4 days


pull request commentrust-lang/rust

[WIP] Token-based outer attributes handling

We need to trim this PR a bit first. Right now I don't really understand what happens here because it's too large.

The next changes are improvements on their own and can be landed in separate PRs:

  • Newly added tests and test cases.
  • Adding tokens to statements.
  • Adding tokens to ast::Attributes and eliminating prepend_attrs (at least partially).
  • Token collection refactoring.

Additionally:

  • Field / variant / etc span changes are responsible for the majority of the diff in test outputs. Can we keep the old spans in AST and not extend them to comma separators?
  • parse_outer_attributes and similar changes involving self -> this dominate the parser diff. Can they be factored out into a separate commit (the first one)? The other commits contain significant back and forth, so I ended up reading the diff for the whole PR rather than commit by commit; it would be better to squash them.
Aaron1011

comment created time in 4 days

pull request commentrust-lang/rust

Defer Apple SDKROOT detection to link time.

LGTM, except that I'd move add_apple_sdk from inside add_pre_link_args to the outside.

I'm tempted to remove the use of Result for built-in target definitions, since I don't think they should be fallible.

This change affects all target specs, better do it in a separate PR.

ehuss

comment created time in 4 days

pull request commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

Could you avoid moving code

Moving foo_to_string to inherent methods also qualifies as a code move here. Moving just the file renaming into a separate commit doesn't really change the diff (or its readability) compared to the old state.

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub fn tokentree_probably_equal_for_proc_macro(
     first: &TokenTree,
     other: &TokenTree,
     sess: &ParseSess,
+    relaxed_delim_match: bool,
 ) -> bool {
     match (first, other) {
         (TokenTree::Token(token), TokenTree::Token(token2)) => {
             token_probably_equal_for_proc_macro(token, token2)
         }
         (TokenTree::Delimited(_, delim, tts), TokenTree::Delimited(_, delim2, tts2)) => {
-            delim == delim2 && tokenstream_probably_equal_for_proc_macro(&tts, &tts2, sess)
+            let delim_match = if relaxed_delim_match {
+                // `NoDelim` delimiters can appear in the captured tokenstream, but not
+                // in the reparsed tokenstream. Allow them to match with anything, so
+                // that we check if the two streams are structurally equivalent.
+                delim == delim2 || *delim == DelimToken::NoDelim || *delim2 == DelimToken::NoDelim
+            } else {
+                delim == delim2
+            };
+            delim_match
+                && tokenstream_probably_equal_for_proc_macro(&tts, &tts2, sess, relaxed_delim_match)
         }
         _ => false,

Can avoid TokenStream::from by using something like

tokens.len() == 1 && token_probably_equal_for_proc_macro(tokens[0], reparsed_token)

though.

Aaron1011

comment created time in 4 days


Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub fn tokentree_probably_equal_for_proc_macro(
     first: &TokenTree,
     other: &TokenTree,
     sess: &ParseSess,
+    relaxed_delim_match: bool,
 ) -> bool {
     match (first, other) {
         (TokenTree::Token(token), TokenTree::Token(token2)) => {
             token_probably_equal_for_proc_macro(token, token2)
         }
         (TokenTree::Delimited(_, delim, tts), TokenTree::Delimited(_, delim2, tts2)) => {
-            delim == delim2 && tokenstream_probably_equal_for_proc_macro(&tts, &tts2, sess)
+            let delim_match = if relaxed_delim_match {
+                // `NoDelim` delimiters can appear in the captured tokenstream, but not
+                // in the reparsed tokenstream. Allow them to match with anything, so
+                // that we check if the two streams are structurally equivalent.
+                delim == delim2 || *delim == DelimToken::NoDelim || *delim2 == DelimToken::NoDelim
+            } else {
+                delim == delim2
+            };
+            delim_match
+                && tokenstream_probably_equal_for_proc_macro(&tts, &tts2, sess, relaxed_delim_match)
         }
         _ => false,

Possible cases here are:

  • ⟪ token_tree ⟫ -> token_tree
  • ⟪ tokens ⟫ -> ( tokens ) (on the second reparse)

So, I think a more precise version of this function would be

pub fn tokentree_probably_equal_for_proc_macro(
    token: &TokenTree,
    reparsed_token: &TokenTree,
    sess: &ParseSess,
    relaxed_delim_match: bool,
) -> bool {
    match (token, reparsed_token) {
        (TokenTree::Token(token), TokenTree::Token(reparsed_token)) => {
            token_probably_equal_for_proc_macro(token, reparsed_token)
        }
        (
            TokenTree::Delimited(_, delim, tokens),
            TokenTree::Delimited(_, reparsed_delim, reparsed_tokens),
        ) if delim == reparsed_delim => {
            tokenstream_probably_equal_for_proc_macro(tokens, reparsed_tokens, sess)
        }
        (TokenTree::Delimited(_, DelimToken::NoDelim, tokens), reparsed_token) => {
            if relaxed_delim_match {
                if let TokenTree::Delimited(_, DelimToken::Paren, reparsed_tokens) = reparsed_token {
                    if tokenstream_probably_equal_for_proc_macro(tokens, reparsed_tokens, sess) {
                        return true;
                    }
                }
            }
            tokenstream_probably_equal_for_proc_macro(tokens, TokenStream::from(reparsed_token), sess)
        }
        _ => false,
    }
}
Aaron1011

comment created time in 4 days
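The asymmetric comparison proposed above can be modeled with simplified stand-in types (a hypothetical two-variant `Delim` and `Tree`, not rustc's): a `NoDelim` group in the captured stream may match either its bare contents or, in relaxed mode, a parenthesized group that the pretty-printer inserted into the reparsed stream.

```rust
#[derive(Debug, PartialEq)]
enum Delim {
    Paren,
    NoDelim,
}

#[derive(Debug, PartialEq)]
enum Tree {
    Token(char),
    Delimited(Delim, Vec<Tree>),
}

// Asymmetric "probably equal": `tok` comes from the captured stream,
// `reparsed` from the pretty-print + reparse round trip.
fn probably_equal(tok: &Tree, reparsed: &Tree, relaxed: bool) -> bool {
    match (tok, reparsed) {
        (Tree::Token(a), Tree::Token(b)) => a == b,
        (Tree::Delimited(d1, t1), Tree::Delimited(d2, t2)) if d1 == d2 => {
            streams_equal(t1, t2, relaxed)
        }
        (Tree::Delimited(Delim::NoDelim, t1), reparsed) => {
            if relaxed {
                // ⟪ tokens ⟫ may have been reparsed as ( tokens ).
                if let Tree::Delimited(Delim::Paren, t2) = reparsed {
                    if streams_equal(t1, t2, relaxed) {
                        return true;
                    }
                }
            }
            // ⟪ token_tree ⟫ may also match the bare token_tree.
            t1.len() == 1 && probably_equal(&t1[0], reparsed, relaxed)
        }
        _ => false,
    }
}

fn streams_equal(a: &[Tree], b: &[Tree], relaxed: bool) -> bool {
    a.len() == b.len() && a.iter().zip(b).all(|(x, y)| probably_equal(x, y, relaxed))
}
```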


pull request commentrust-lang/rust

pretty-print-reparse hack: Rename some variables for clarity

@bors rollup

petrochenkov

comment created time in 4 days

push eventpetrochenkov/rust

Vadim Petrochenkov

commit sha 275bf626f615f7f154249606ad369d6c142801a5

pretty-print-reparse hack: Rename some variables for clarity

view details

Vadim Petrochenkov

commit sha fe3e5aa729ee34749ae730bbb5fd9c906877b82a

pretty-print-reparse hack: Remove an impossible case Delimiters cannot appear as isolated tokens in a token stream

view details

push time in 4 days

PR opened rust-lang/rust

pretty-print-reparse hack: Rename some variables for clarity

This will also make it easier to make the comparisons asymmetric.

Also one impossible case is removed.

r? @Aaron1011

+23 -20

0 comment

1 changed file

pr created time in 4 days

create barnchpetrochenkov/rust

branch : reparse

created branch time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub fn nt_to_tokenstream(nt: &Nonterminal, sess: &ParseSess, span: Span) -> Toke
     // modifications, including adding/removing typically non-semantic
     // tokens such as extra braces and commas, don't happen.
     if let Some(tokens) = tokens {
+        // If the streams match, then the AST hasn't been modified. Return the captured
+        // `TokenStream`.
         if tokenstream_probably_equal_for_proc_macro(&tokens, &tokens_for_real, sess) {
             return tokens;
         }
+
+        // The check failed. This time, we pretty-print the AST struct with parenthesis
+        // inserted to preserve precedence. This may cause `None`-delimiters in the captured
+        // token stream to match up with inserted parenthesis in the reparsed stream.
+        let source_with_parens = pprust::nonterminal_to_string(nt);
+        let filename_with_parens = FileName::macro_expansion_source_code(&source_with_parens);
+        let tokens_with_parens = parse_stream_from_source_str(
        let reparsed_tokens_with_parens = parse_stream_from_source_str(
Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub fn tokentree_probably_equal_for_proc_macro(
             token_probably_equal_for_proc_macro(token, token2)
         }
         (TokenTree::Delimited(_, delim, tts), TokenTree::Delimited(_, delim2, tts2)) => {
-            delim == delim2 && tokenstream_probably_equal_for_proc_macro(&tts, &tts2, sess)
+            // `NoDelim` delimiters can appear in the captured tokenstream, but not
+            // in the reparsed tokenstream. Allow them to match with anything, so
+            // that we check if the two streams are structurally equivalent.
+            (delim == delim2 || *delim == DelimToken::NoDelim || *delim2 == DelimToken::NoDelim)

Could this relaxed comparison be enabled by a flag rather than applied always? It only makes sense when we are trying to reparse and compare for the second time.

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

+#[cfg(test)]
+mod tests;
+
+pub mod state;
+pub use state::{print_crate, AnnNode, Comments, PpAnn, PrintState, State};
+
+use rustc_ast as ast;
+use rustc_ast::token::{Nonterminal, Token, TokenKind};
+use rustc_ast::tokenstream::{TokenStream, TokenTree};
+
+pub fn nonterminal_to_string_no_extra_parens(nt: &Nonterminal) -> String {
+    let mut state = State::new();
+    state.insert_extra_parens = false;
+    state.nonterminal_to_string(nt)
    State { insert_extra_parens: false, ..State::new() }.nonterminal_to_string(nt)
Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

 pub fn nt_to_tokenstream(nt: &Nonterminal, sess: &ParseSess, span: Span) -> Toke
     };

     // FIXME(#43081): Avoid this pretty-print + reparse hack
-    let source = pprust::nonterminal_to_string(nt);
+    // Pretty-print the AST struct without inserting any parenthesis
+    // beyond those explicitly written by the user (e.g. `ExpnKind::Paren`).
+    // The resulting stream may have incorrect precedence, but it's only
+    // ever used for a comparison against the capture tokenstream.
+    let source = pprust::nonterminal_to_string_no_extra_parens(nt);
     let filename = FileName::macro_expansion_source_code(&source);
     let tokens_for_real = parse_stream_from_source_str(filename, source, sess, Some(span));
    let reparsed_tokens = parse_stream_from_source_str(filename, source, sess, Some(span));
Aaron1011

comment created time in 4 days


pull request commentrust-lang/rust

Refactor AST pretty-printing to allow skipping insertion of extra parens

@Aaron1011 Could you avoid moving code (e.g. to pprust/state.rs) and changing code in the same commit? It's hard to review this way. The move can be split into a separate commit.

Aaron1011

comment created time in 4 days

pull request commentrust-lang/rust

Encode less metadata for proc-macro crates

r=me with nits addressed.

Aaron1011

comment created time in 4 days

Pull request review commentrust-lang/rust

Encode less metadata for proc-macro crates

 impl<'a, 'tcx> CrateMetadataRef<'a> {

     fn all_def_path_hashes_and_def_ids(&self) -> Vec<(DefPathHash, DefId)> {
         let mut def_path_hashes = self.def_path_hash_cache.lock();
-        (0..self.num_def_ids())
-            .map(|index| {
-                let index = DefIndex::from_usize(index);
-                (self.def_path_hash_unlocked(index, &mut def_path_hashes), self.local_def_id(index))
-            })
-            .collect()
+        let mut def_index_to_data = |index| {
+            (self.def_path_hash_unlocked(index, &mut def_path_hashes), self.local_def_id(index))
+        };
+        if self.root.is_proc_macro_crate() {
        if let Some(data) = &self.root.proc_macro_data {
Aaron1011

comment created time in 4 days
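The suggestion to bind the data directly rather than testing a flag and unwrapping later is a common Rust idiom: `if let Some(x) = &opt` checks the `Option` exactly once and borrows its contents. A standalone sketch with a hypothetical `Root` type (only loosely modeled on the metadata root):

```rust
struct Root {
    proc_macro_data: Option<Vec<u32>>,
}

impl Root {
    // Flag-style check: tells you *whether* data exists, but not *what* it is.
    fn is_proc_macro_crate(&self) -> bool {
        self.proc_macro_data.is_some()
    }

    // Binding-style check: the `Option` is inspected once and `data`
    // is usable immediately, with no later `unwrap`.
    fn macro_count(&self) -> usize {
        if let Some(data) = &self.proc_macro_data {
            data.len()
        } else {
            0
        }
    }
}
```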

Pull request review commentrust-lang/rust

Encode less metadata for proc-macro crates

 provide! { <'tcx> tcx, def_id, other, cdata,
         })
     }
     proc_macro_decls_static => {
-        cdata.root.proc_macro_decls_static.map(|index| {
-            DefId { krate: def_id.krate, index }
+        cdata.root.proc_macro_data.as_ref().map(|index| {
        cdata.root.proc_macro_data.as_ref().map(|data| {
Aaron1011

comment created time in 4 days


pull request commentrust-lang/rust

Late link args order

@bors r+

mati865

comment created time in 4 days

PR closed rust-lang/rust

Allow path as value in name-value attribute A-attributes A-proc-macros S-waiting-on-review T-lang

This commit allows proc macros to take inert attributes that look like #[path = ident] or #[path = p::p::path]. Previously the rhs was only permitted to be an unsuffixed numeric literal or the identifier special cases true and false. However, parenthesized #[path(key = ident)] and #[path(key = p::p::path)] were already allowed; see https://docs.rs/cxx/0.4.4/cxx/attr.bridge.html for one attribute currently using that style.

My use case for this is related to C++ namespace support in https://github.com/dtolnay/cxx.

- #[cxx::bridge(namespace = co::stuff)]  // previously only one namespace can be specified
+ #[cxx::bridge]
  mod ffi {
+     #[namespace = co::server]  // now can allow different namespace for different stuff
      extern "C++" {
          ...
+     }

+     #[namespace = co::client]
+     extern "C++" {
          ...
      }
  }

Chesterton's fence: why was the rhs syntax restricted originally? I believe this was to leave open the possibility of #[m1 = "...", m2 = "..."] syntax which would be equivalent to #[m1 = "..."] #[m2 = "..."] but more compact (this might still happen). Allowing an "arbitrary tokenstream" on the rhs would rule that out. In this PR I've still kept the rhs syntax quite restricted, adding only mod-style paths and remaining compatible with compact syntax being introduced later.

+114 -14

5 comments

13 changed files

dtolnay

pr closed time in 4 days

pull request commentrust-lang/rust

Allow path as value in name-value attribute

Do you think we shouldn't take this PR without the bigger change in #55414 (comment)?

Yes. I'll try to prioritize https://github.com/rust-lang/rust/issues/55414#issuecomment-554005412 and implement it myself sooner.

In the meantime #[path(key = p::p::path)] can be written through a string literal as #[path(key = "p::p::path")], so supporting paths here is not very urgent.

dtolnay

comment created time in 4 days

pull request commentrust-lang/rust

Late link args order

error: .github/workflows/ci.yml is not up to date caused by: src/ci/github-actions/ci.yml and .github/workflows/ci.yml are different

mati865

comment created time in 4 days

pull request commentrust-lang/rust

Fix documentation highlighting in ty::BorrowKind

@bors r+ rollup

jyn514

comment created time in 5 days

pull request commentrust-lang/rust

[experiment] Add MustClone and impl Copy+MustClone on Range{,From,Inclusive}.

I like the general direction here, but I also agree with https://github.com/rust-lang/rfcs/issues/2848#issuecomment-699090192 that it feels more like an "attribute + lint" case than a "trait + move error (?)" case, especially given that the trait is not used in a generic context, only on concrete instances of ranges.

eddyb

comment created time in 5 days

pull request commentrust-lang/rust

might_permit_raw_init: also check aggregate fields

@bors r+

RalfJung

comment created time in 5 days
