OctaveDouble precision differences due to Octave hex representation #3

Open
karlulricsimon opened this issue Jul 18, 2018 · 0 comments

In testIoOctaveDouble.java I added the following test:

    @Test
    public void testPrecision() {
        final OctaveEngine octave = new OctaveEngineFactory().getScriptEngine();
        octave.eval("format long");
        octave.eval("a = (1/3) * 0.29");
        OctaveDouble a = octave.get(OctaveDouble.class, "a");
        // This assertion fails: on evaluation the Octave console reports
        // "a =  0.0966666666666667", but retrieving the value back from the
        // wrapper yields the native-hex value cast to a double.
        assertEquals(0.096666666666667, a.get(1), 0.0);
    }

In the Octave command window the value of the variable a displays as "0.0966666666666667", but if the format is changed to 'native-hex', a is shown as 'be58f28b25bfb83f', which converts to "0.09666666666666665".
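
For reference, here is a minimal sketch of decoding that native-hex string in plain Java (the class name is illustrative, and it assumes the bytes are printed in little-endian order, as on x86):

    public class HexToDouble {
        public static void main(String[] args) {
            // Bytes exactly as printed by Octave's 'format native-hex'.
            long littleEndianBits = Long.parseUnsignedLong("be58f28b25bfb83f", 16);
            // Reverse to big-endian and reinterpret the IEEE 754 bit pattern as a double.
            double value = Double.longBitsToDouble(Long.reverseBytes(littleEndianBits));
            // Prints the shortest round-trip decimal; per the report above this is 0.09666666666666665.
            System.out.println(value);
        }
    }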

It seems that this is a bug in how the wrapper stores the variable value, though it could be that my expectations are incorrect.
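
One way to narrow this down would be to dump the raw IEEE 754 bits of the double the wrapper returns and compare them against the native-hex bytes reported by Octave (after byte-reversal, as above). A sketch, reusing the octave engine and the get call from the test; the local variable name is illustrative:

    OctaveDouble a = octave.get(OctaveDouble.class, "a");
    double fromWrapper = a.get(1);
    // Big-endian hex of the bit pattern actually held by the wrapper.
    System.out.printf("%016x%n", Double.doubleToLongBits(fromWrapper));
    // Shortest decimal string that uniquely identifies that double.
    System.out.println(Double.toString(fromWrapper));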

I'm using Ubuntu 16.04
octaveInJava version 0.6.8
Octave version 4.3.0
