I am trying to learn Python and have tried to convert a JavaScript code snippet to Python. The JavaScript function I created to calculate the GCD is as follows:
// program to find the GCD of two integers
let gcd

function GCD() {
  // take input
  const number1 = prompt('Enter a first positive integer: ')
  const number2 = prompt('Enter a second positive integer: ')

  // loop from 1 up to both number1 and number2
  for (let i = 1; i <= number1 && i <= number2; i++) {
    if (number1 % i == 0 && number2 % i == 0) {
      gcd = i
    }
  }

  // display the gcd
  document.write(`GCD of ${number1} and ${number2} is ${gcd}.`)
}

GCD()
If I supply 9 as the first integer and 3 as the second, I get a GCD of 3, as expected. (Note that prompt() returns strings; the comparisons and % work here only because JS coerces the strings to numbers.)

I have tried to convert this to a Python program as follows:
def gcd():
    gcd = 0
    num1 = int(input("Enter a first positive integer: "))
    num2 = int(input("Enter a second positive integer: "))
    for i in range(1, i<=num1 and i<=num2):
        if num1 % i == 0 and num2 % i == 0:
            gcd = i
    print(f"GCD of: {num1} and: {num2} is {gcd}")

gcd()
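This version fails before the loop even starts. As far as I can tell, Python's range() is nothing like the JS for header: its start and stop arguments are evaluated once, up front, as plain numbers, not as a condition re-checked on every pass, so i<=num1 and i<=num2 is evaluated while i is still unbound and raises an UnboundLocalError. A quick sketch of how range() actually behaves (the numbers here are just examples I picked):

# range(start, stop) is computed once, before the loop runs;
# stop is exclusive, so this prints [1, 2, 3]
print(list(range(1, 4)))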
But I don't know how to express the JS loop condition in a Python for statement. If I change the for statement to:
def gcd():
    gcd = 0
    num1 = int(input("Enter a first positive integer: "))
    num2 = int(input("Enter a second positive integer: "))
    for i in range(1, num1 and num2):
        if num1 % i == 0 and num2 % i == 0:
            gcd = i
    print(f"GCD of: {num1} and: {num2} is {gcd}")

gcd()
and give the same input as before, 9 and 3, I get a GCD of 1. As far as I can tell, that is because num1 and num2 is not a pair of bounds: 9 is truthy, so the expression evaluates to num2, i.e. 3, and range(1, 3) only tests i = 1 and i = 2. The loop header I was actually looking for runs from 1 up to the smaller of the two inputs, inclusive:

for i in range(1, min(num1, num2)+1):
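Putting it all together, this version works for me. It keeps the shape of the JS original; I renamed the result variable so it no longer shadows the function name gcd (that rename is purely my own choice):

def gcd():
    num1 = int(input("Enter a first positive integer: "))
    num2 = int(input("Enter a second positive integer: "))
    # 1 divides every positive integer, so it is a safe starting value
    result = 1
    # test every candidate from 1 up to the smaller input, inclusive
    for i in range(1, min(num1, num2) + 1):
        if num1 % i == 0 and num2 % i == 0:
            result = i
    print(f"GCD of {num1} and {num2} is {result}")

gcd()

With 9 and 3 this prints "GCD of 9 and 3 is 3", matching the JS version. For what it's worth, the standard library can also do this directly: math.gcd(num1, num2) returns the same result without any loop.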