serde deserializer fails to deserialize untagged enums which have integer (i128) values #19
Comments
Adding a feature is certainly possible, though I generally prefer taking the time to think about it a bit, to see whether there's a way so that we don't end up with an infinite list of knobs one could tweak (and end up in maintenance/support hell). Hence I wonder if it makes sense to change the Integer type to `i64` by default.

Another way to look at this is: this is another case where serde untagged enums are a problem. So should we just add a new feature as proposed here and document properly that if you use untagged enums, you should enable that feature?

@Stebalien Do you have any thoughts on this?
Thanks for your thoughts on this! As a library user (if we don't consider compatibility, etc.), would the ideal form be to use `i64`?
So, the issue with just `i64` is that not all valid DAG-CBOR integers fit into it. I wonder if we should have something like an `IpldGeneric` type:

```rust
/// Doesn't have to be a trait, could just be type parameters but... that's a lot of type parameters.
pub trait Primitives {
    type Integer;
    type String;
    type Map;
    // ...
}

pub enum IpldGeneric<P: Primitives> {
    Integer(P::Integer),
    String(P::String),
    // ...
}

type Ipld = IpldGeneric<DefaultPrimitives>;
```

That should give users the tools to do whatever they need. But it's also adding complexity. (Anyways, please ignore the …)
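(For illustration only, a hedged sketch of what the `DefaultPrimitives` referenced above could look like; it keeps today's `i128`-backed integer and fills in only the associated types that the trait sketch spells out.)

```rust
use std::collections::BTreeMap;

// Hypothetical sketch, not part of the proposal: a default parameter set that
// keeps the current representation, with IPLD integers backed by i128.
#[derive(Clone)]
pub struct DefaultPrimitives;

impl Primitives for DefaultPrimitives {
    type Integer = i128;
    type String = String;
    // Map values are the generic IPLD type itself.
    type Map = BTreeMap<String, IpldGeneric<DefaultPrimitives>>;
}
```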
@Stebalien That's an interesting idea. Having used Rust for a while, I wonder if that adds complexity in unintended places, and whether it makes interoperability between systems/libraries a problem. But maybe it's also a win, as this way things break in visible ways and not just subtly due to random feature flags. @sugyan would you be open to giving that idea a chance and trying it out? I'd imagine that atrium is a large enough system to get an idea of whether this ends up in unusable complexity or is quite easy to use.
@vmx I personally don't care which implementation changes are made, as long as they solve the enum deserialization failures that are currently troubling atrium. I think it would be a great implementation if it allows users to choose the types they want more flexibly. However, I can't imagine any case where a user would want to change a primitive type other than making the integer `i64` or larger.
As issue #19 outlines, it might make sense to use a different internal type for integer values. This commit makes all primitive types generic, so that you can choose your own. Though you're discouraged from doing so. Usually using the IPLD type should be enough and leads to the best interoperability. Thanks @Stebalien for the idea of this implementation. Fixes #19.
@sugyan I've created a PR with @Stebalien's suggestion at #21. It would be cool if you could try it out and let me know if you run into any weird issues. My hope is that things will be smooth. You would add something like this to your code:

```rust
#[derive(Clone)]
pub struct PrimitivesI64;

impl Primitives for PrimitivesI64 {
    type Bool = bool;
    type Integer = i64;
    type Float = f64;
    type String = String;
    type Bytes = Vec<u8>;
    type Link = Cid;
}

type Ipld = IpldGeneric<PrimitivesI64>;
```
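(For illustration only, a hedged sketch of how that alias might then be used; it assumes `IpldGeneric` keeps the `Integer` variant from the earlier sketch.)

```rust
// Hypothetical usage sketch: the alias is used like the default type, but the
// integer value is a plain i64 under PrimitivesI64.
fn answer() -> Ipld {
    Ipld::Integer(42)
}
```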
Unfortunately, in my case this does not seem to work as expected: https://github.com/sugyan/atrium/actions/runs/10372924349/job/28716796955?pr=212#step:3:187 Because the only implementation of …
By default the IPLD Integer kind is represented internally as an `i128`. Serde has problems with untagged enums that contain i128 types. Therefore a feature flag called `integer-max-i64` is introduced, which reduces the internal integer representation to `i64`. This flag should be used with caution as e.g. not all valid DAG-CBOR data can now be represented. Closes #19.
@sugyan sorry for taking so long, and thanks for giving #21 a try. Although it's kind of a nice idea to solve it with traits, the added complexity gets too far out of control for my taste. Thinking about it again, I think a feature flag might be the simplest and easiest-to-follow solution. @sugyan if you could try out #22, that would be great. It should be as simple as enabling the `integer-max-i64` feature.
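(For illustration, a hedged sketch of what the flag changes in practice; the `ipld_core::ipld::Ipld` path is an assumption about the crate layout.)

```rust
use ipld_core::ipld::Ipld;

// Hedged sketch: this only compiles with the `integer-max-i64` feature enabled,
// because the value inside Ipld::Integer is then an i64 rather than an i128.
fn as_i64(value: &Ipld) -> Option<i64> {
    match value {
        Ipld::Integer(i) => Some(*i),
        _ => None,
    }
}
```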
@vmx Thank you for continuing to work on solving the problem! It seems to be very difficult to solve this with traits...
Similar to the issue previously reported in ipld/serde_ipld_dagcbor#21, when trying to deserialize to an internally tagged or untagged `enum`, it seems that serde internally calls `deserialize_any`, in which case `i128` is not supported. The following test fails.
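(The test itself is not reproduced in this thread excerpt. As an illustration only, here is a hedged sketch of the kind of case described; the enum, its variants, and the `ipld_core::serde::from_ipld` helper are assumptions, not the original test.)

```rust
use ipld_core::ipld::Ipld;
use serde::Deserialize;

// Hedged sketch of the reported failure mode, not the original test:
// an untagged enum makes serde fall back to `deserialize_any`, and the
// i128 backing Ipld::Integer is not supported on that path.
#[derive(Debug, Deserialize)]
#[serde(untagged)]
enum Value {
    Number(i64),
    Text(String),
}

fn main() {
    let ipld = Ipld::Integer(42);
    // Expected to yield Value::Number(42), but fails with an
    // "i128 is not supported" style error.
    let result: Result<Value, _> = ipld_core::serde::from_ipld(ipld);
    println!("{:?}", result);
}
```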
I think this is an unresolved issue in serde, so it may not be the responsibility of this library. However, we can work around this problem by changing `Ipld::Integer` to hold an `i64` instead of an `i128`.

My experimental branch: sugyan@539530d
Since the IPLD specification only requires support for integer values up to 2^53, the change to `i64` is a breaking change but still meets the specification: https://ipld.io/design/tricky-choices/numeric-domain/#integers
Is it possible to change it so that, for example, if a certain feature flag is specified, `Ipld::Integer` uses `i64`?
My background is that I am developing a library, atrium, and I would like to provide a way to hold `unknown` values received from the server as an `Ipld` and deserialize them into an arbitrary type on the user side. However, I am having trouble converting from `Ipld` to other types due to this problem...