# UVa 575

## Summary

Convert a skew binary number to decimal. Skew binary is not a conventional base: the digit at position $k$ (counting from the least significant digit at ${\displaystyle k=0}$) has weight ${\displaystyle 2^{k+1}-1}$.

## Explanation

Standard positional conversion to decimal, except that each position's weight is ${\displaystyle 2^{k+1}-1}$ instead of a power of a fixed base. Start at the least significant digit (digits can be 0, 1, or 2) and work your way to the most significant digit, accumulating digit × weight.
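A minimal sketch of the conversion (the function name `skew_to_decimal` is my own, not from the problem statement), assuming the weighting above where the digit $k$ places from the right, with $k$ starting at 1 for the last digit, contributes digit × $(2^k - 1)$:

```python
def skew_to_decimal(s: str) -> int:
    """Convert a skew binary string to its decimal value.

    The digit k places from the right (k = 1 for the last digit)
    contributes digit * (2**k - 1) to the total.
    """
    n = len(s)
    total = 0
    for i, d in enumerate(s):
        k = n - i                      # position from the right, 1-based
        total += int(d) * (2**k - 1)   # weight of this position
    return total
```

For the first sample line, `skew_to_decimal("10120")` gives 31 + 7 + 2·3 = 44, matching the expected output.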

## Gotchas

• Input of 0 terminates.

## Notes

• The output fits in a signed 32-bit int: the largest possible value is exactly ${\displaystyle 2^{31}-1 = 2147483647}$.

## Input

```
10120
200000000000000000000000000000
10
1000000000000000000000000000000
11
100
11111000001110000101101102000
0
```

## Output

```
44
2147483646
3
2147483647
4
7
1041110737
```