
Debugger not showing correct decimal values #738

Closed
ruicraveiro opened this issue Aug 31, 2016 · 3 comments


@ruicraveiro

Environment data

dotnet --info output:
.NET Command Line Tools (1.0.0-preview2-003121)

Product Information:
Version: 1.0.0-preview2-003121
Commit SHA-1 hash: 1e9d529bc5

Runtime Environment:
OS Name: Mac OS X
OS Version: 10.11
OS Platform: Darwin
RID: osx.10.11-x64

VS Code version:
C# Extension version: 1.4.0

Steps to reproduce

Create a simple class with a decimal property, instantiate that class, optionally assign a value to the property, and check the value the debugger (Variables and Watch panels) shows for the property. It differs from the actual value.
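A minimal sketch of the steps above (class and property names are illustrative, not taken from the original report):

```csharp
using System;

public class Product
{
    public decimal Price { get; set; }
}

public class Program
{
    public static void Main()
    {
        var product = new Product { Price = 123.45m };
        // Set a breakpoint on the next line: the Variables and Watch
        // panels display a wrong value for product.Price, while the
        // running program itself uses the correct value.
        Console.WriteLine(product.Price);
    }
}
```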

Expected behaviour

The Watch panel should show the property's actual value.

Actual behaviour

The debugger watch is showing wrong values. The value of the property itself is correct, though, because when the application actually uses it, the result is as expected.

(Screenshot: demo of the debugger showing an incorrect decimal value)

@gregg-miskelly
Contributor

This is due to https://github.com/dotnet/coreclr/issues/6935.

@gregg-miskelly
Contributor

The CoreCLR team has checked in a fix for their 1.1 release.

@ruicraveiro
Author

Thanks for the feedback!
