Hi there,
I have decided to learn Java programming. Rather than working through a book, which I find very dull, I have just dived in by writing (trying to write) an application.
My application is for my other passion in life: darts. Basically, what the application does at the moment is calculate a player's score from 501 down to 0, finishing on a double. This works fine, but when I try to calculate the player's one-dart average I'm getting the wrong figure. To test this I played a 9-dart leg, where the one-dart average should be 55.67 (501 points scored with 9 darts), but my code comes up with 55.0.
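Just to show the figure I'm expecting, here's a minimal sketch of the sum for a 9-dart leg (the class name AverageCheck and the hard-coded numbers are just made up for this example, it's not part of my actual program):

// quick check of the expected figure for a 9-dart 501 leg
public class AverageCheck {
    public static void main(String[] args) {
        int points = 501;   // points scored in the leg
        int darts = 9;      // darts thrown
        double oneDartAverage = (double) points / darts; // divide as doubles
        System.out.println("one dart average = " + oneDartAverage); // roughly 55.67
    }
}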
I know I'm probably missing something really obvious; I'm just at a loss to know what it is.
Here is the code. Apologies if anyone thinks it's messy; it's the first programming I have done for about 14 years!
import java.util.Scanner;

public class DartApp {

    public static void main(String[] args) {
        // create a scanner
        Scanner scanner = new Scanner(System.in);

        int total = 501;   // to be changed depending on game
        int nthrows = 0;   // number of throws to work out average
        float tdartave;    // three dart average calculator

        do {
            System.out.print("enter player 1 score: ");
            int score = scanner.nextInt();
            total = total - score;
            System.out.println("score is " + total);

            // calculate number of throws
            if (total > 0) {
                nthrows = nthrows + 3;
            } else {
                System.out.print("which dart was the winning dart 1,2,3 : ");
                int wdart = scanner.nextInt();
                nthrows = nthrows + wdart;
                System.out.println("darts thrown = " + nthrows);
            }
        } while (total > 0);

        tdartave = (501 / nthrows);
        System.out.println("three dart average = " + tdartave);
    }
}
Cheers for any help that can be offered, and sorry if I appear thick.
Mike