## Tuesday, June 18, 2013

### Testing a rounding problem

Floating-point numbers are characterized by their precision. Because they are inherently inexact, a specification that involves calculating with them should define the precision it expects: one digit, fifteen digits, or whatever fits the use case.
So the question is: how do you test the defined precision? In the following example I expect a precision of 2 digits. For clarity I show the code first, followed by the test. The class is simple circle stuff with a method that returns the calculated area:
```ruby
class Circle
  attr_accessor :radius

  def initialize(radius = 1)
    @radius = radius
  end

  def area
    radius * radius * Math::PI
  end
end
```
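Before looking at the test, it is worth seeing the raw value this method returns. A quick check (repeating the class so the snippet runs on its own):

```ruby
class Circle
  attr_accessor :radius

  def initialize(radius = 1)
    @radius = radius
  end

  def area
    radius * radius * Math::PI
  end
end

# The area of a circle with radius 2 is 4 * PI, and #area
# returns it with full Float precision:
puts Circle.new(2).area          # => 12.566370614359172

# An exact comparison against the rounded value therefore fails:
puts Circle.new(2).area == 12.56 # => false
```

This is exactly why the test below cannot simply compare against 12.56.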
There are the two accessors for the instance variable "radius" and the constructor "Circle#initialize" with a default parameter for a unit circle. Finally, there is the algorithm for calculating the area of a circle in "Circle#area", which returns the most precise result possible. The RSpec test is:
```ruby
# circle_spec.rb
require './circle.rb'

describe Circle do
  describe "#area" do
    it "should calculate the area of a circle" do
      radius = 2
      circle = Circle.new radius
      circle.area.should be_within(0.01).of(12.56)
    end
  end
end
```
"Matchers#be_within" ensures that the result deviates from the expected value by no more than 0.01, i.e. the precision is 2 digits, even though the method itself returns a more precise number.
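Conceptually, the matcher's check boils down to an absolute-difference comparison. A plain-Ruby sketch of what the expectation above asserts (variable names are illustrative, not RSpec internals):

```ruby
# What be_within(0.01).of(12.56) asserts, sketched in plain Ruby:
actual   = 2 * 2 * Math::PI   # the value Circle.new(2).area returns
expected = 12.56
delta    = 0.01

puts (actual - expected).abs <= delta  # => true

# An exact comparison would fail instead:
puts actual == expected                # => false
```

The difference here is about 0.0064, which is inside the 0.01 tolerance, so the spec passes.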

Supported by Ruby 1.9.3