DateTime serialization should preserve the timezone #43
Comments
Always open for good contributions; that's how this module gets improved.
I have looked into the code of DateTimeSerializer and I think the problem could be solved if the DEFAULT_FORMAT constant were changed to no longer include the
@if12b017 Perhaps, but wouldn't this be a backwards-incompatible change? That is, a change to existing behavior such that existing code would start working differently, possibly causing issues.
That is true, but anyone who upgrades their libraries to a newer version should be aware of the changes.
I think it all boils down to the concept of time: the time zone is meta-information. The information itself, the time in milliseconds, never changes; an instant of time is the same everywhere in the world. It is only the meta-info that changes. In fact, by expressing that meta-info as UTC you lose the zone metadata and preserve only the instant-of-time data. We have dealt with
This special case is only valid for
We always disable the feature
In the following example we know we set the time zone for Australia (+10:00), but the String stored in Riak is its UTC representation, which loses the zone metadata.
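The zone-loss behaviour described above can be sketched with the JDK's own java.time API (used here instead of Joda-Time so the example is self-contained; the class and variable names are mine, not from the project):

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class ZoneLossDemo {
    public static void main(String[] args) {
        // A timestamp created in an Australian offset (+10:00).
        ZonedDateTime sydney = ZonedDateTime.parse("2015-07-01T10:15:30+10:00");

        // Serializing as a UTC string keeps the point in time...
        ZonedDateTime utc = sydney.withZoneSameInstant(ZoneOffset.UTC);
        String stored = utc.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        System.out.println(stored); // 2015-07-01T00:15:30Z

        // ...but after a round trip the original +10:00 offset is gone:
        // the instant is preserved, the zone metadata is not.
        ZonedDateTime restored = ZonedDateTime.parse(stored);
        System.out.println(sydney.isEqual(restored));                         // true
        System.out.println(sydney.getOffset().equals(restored.getOffset())); // false
    }
}
```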
Which adds the overhead of creating another
Yes, I have got the same line in my code several times.
I'm facing this issue as well. I want to preserve the time zone and don't want to convert to UTC. Interestingly, on deserializing an ISO-formatted DateTime string, you can set
I think the default behaviour should be to preserve on the ISO date/time String the
One wrinkle is that DateTime can hold a full time zone (e.g. America/Chicago), whereas the ISO format will constrain that to an offset, which cannot be converted back to a full zone. So some of the time zone metadata will be lost regardless.
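That wrinkle can be demonstrated with a short java.time sketch (illustrative only; Joda's DateTime behaves analogously). Formatting with an offset-only ISO pattern and parsing back yields a fixed offset, not the original region id:

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class OffsetVsZoneDemo {
    public static void main(String[] args) {
        ZonedDateTime chicago = ZonedDateTime.of(2015, 1, 15, 9, 30, 45, 0,
                ZoneId.of("America/Chicago"));

        // The ISO offset form can only carry "-06:00", not "America/Chicago".
        String iso = chicago.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        System.out.println(iso); // 2015-01-15T09:30:45-06:00

        // Parsing it back yields a fixed-offset zone, so the DST rules
        // attached to the original region id are lost.
        ZonedDateTime parsed = ZonedDateTime.parse(iso);
        System.out.println(parsed.getZone()); // -06:00
    }
}
```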
Quick note: this question REALLY needs to be discussed on the dev mailing list, since it relates to multiple datatypes, not just Joda. Specifically, core JDK date types from 1.0 as well as the new 1.8 Date API are involved.
I did propose #44 to try to deal with this. I do plan to bring it up on the dev list, and I have some changes I'd like to make to that PR. I suppose this is my +1 that retaining the time zone across de/serialization operations of DateTime is important.
Right, I am just saying that I don't want to get in a change that would end up having to be reverted.
I've now got #44 where I think it's correct at least as far as the time zone details are concerned. I'm not sure it's correct as far as defaults and overriding formats and those kinds of issues. I imagine there's room for improvement even if it is correct. I've also posted to the dev list (http://markmail.org/message/73aw5vcvsb63b7lt). |
As per my comment on PR #44: Jackson 2.6 adds
which at least allows enabling/disabling of zone-id inclusion over the offset. One possibility here is that the proposed format (like "0/UTC") would actually be used when writing is (otherwise) by timestamp (numeric). Conversely, when using the string representation, we'd use a full ISO-8601(-like) String followed by a suffix, perhaps "/zoneid", if that is a common enough standard?
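For comparison with the suffix idea: java.time's default ZonedDateTime text form already appends the zone id in brackets after the ISO-8601 offset string, which is one existing convention for carrying the full zone alongside the offset (this is the JDK's format, not anything Jackson currently emits):

```java
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ZoneIdSuffixDemo {
    public static void main(String[] args) {
        ZonedDateTime sydney = ZonedDateTime.of(2015, 7, 1, 10, 0, 0, 0,
                ZoneId.of("Australia/Sydney"));

        // toString() emits the ISO-8601 offset form plus "[zoneid]",
        // so the full region zone survives a round trip.
        String text = sydney.toString();
        System.out.println(text); // 2015-07-01T10:00+10:00[Australia/Sydney]

        ZonedDateTime back = ZonedDateTime.parse(text);
        System.out.println(back.getZone());       // Australia/Sydney
        System.out.println(back.equals(sydney));  // true
    }
}
```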
I guess it has always been like this, but it doesn't make much sense to me. DateTime is basically a LocalDateTime with a DateTimeZone attached. Generally, when people use DateTime they want time zone info; otherwise they would use a LocalDateTime.
I think that, as a result of this design decision, many people have to implement custom DateTime functionality to deal with the time zone.
What Jackson should be doing is writing out the time zone id along with the millis for DateTimes.
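A minimal sketch of that millis-plus-zone-id idea, using java.time rather than Joda so it runs against the JDK alone (the two-field representation here is my illustration, not an actual Jackson format):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class MillisPlusZoneDemo {
    public static void main(String[] args) {
        ZonedDateTime original = ZonedDateTime.of(2015, 7, 1, 10, 0, 0, 0,
                ZoneId.of("Australia/Sydney"));

        // Serialize as the two pieces the comment asks for:
        // the instant in millis, and the zone id alongside it.
        long millis = original.toInstant().toEpochMilli();
        String zoneId = original.getZone().getId();

        // Deserialize by recombining them; neither the instant
        // nor the zone metadata is lost.
        ZonedDateTime restored =
                Instant.ofEpochMilli(millis).atZone(ZoneId.of(zoneId));
        System.out.println(restored.equals(original)); // true
    }
}
```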