I have code like this:
const loop1 = length => new Promise((resolve, reject) => {
  try {
    let b = 0
    for (let i = 0; i < length; i++) b = b + i
    resolve(b)
  } catch (e) {
    reject(e)
  }
})
const loop2 = length => new Promise((resolve, reject) => {
  try {
    let b = 0
    for (let i = 0; i < length; i++) b = b + i
    resolve(b)
  } catch (e) {
    reject(e)
  }
})
const startTime = new Date().getTime()
loop1(10000000000).then(result => {
  const endTime = new Date().getTime()
  const duration = endTime - startTime
  console.log(`loop1: ${duration}`, result)
}).catch(error => console.log('loop1 error:', error))
loop2(1).then(result => {
  const endTime = new Date().getTime()
  const duration = endTime - startTime
  console.log(`loop2: ${duration}`, result)
}).catch(error => console.log('loop2 error:', error))
const endTime = new Date().getTime()
const duration = endTime - startTime
console.log('duration', duration)
Why is the output like this?
root@ububtu:~$ node .
duration 15539
loop1: 15545 49999999990067860000
loop2: 15545 0
Why isn't the output like this instead?
root@ububtu:~$ node .
duration 0
loop2: 5 0
loop1: 15545 49999999990067860000
Why does loop2 have to wait for loop1 to give its result? Why isn't loop1 passed over so that loop2 gives its result first? And why is the final duration more than 15 seconds instead of less than 1 second?
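For reference, here is a minimal sketch of the behaviour I was expecting (my own variant, not the code above, and loopInWorker is just a name I made up): the summing loop runs off the main thread in a worker_threads Worker, so I assumed the main script would finish almost immediately and loop2 could resolve before loop1.

const { Worker } = require('worker_threads')

const loopInWorker = length => new Promise((resolve, reject) => {
  // Run the same summing loop in a separate thread and post the result back.
  const worker = new Worker(
    `const { parentPort, workerData } = require('worker_threads')
     let b = 0
     for (let i = 0; i < workerData; i++) b = b + i
     parentPort.postMessage(b)`,
    { eval: true, workerData: length }
  )
  worker.once('message', resolve)
  worker.once('error', reject)
})

const startTime = new Date().getTime()
loopInWorker(10000000000).then(result =>
  console.log(`loop1: ${new Date().getTime() - startTime}`, result))
loopInWorker(1).then(result =>
  console.log(`loop2: ${new Date().getTime() - startTime}`, result))
console.log('duration', new Date().getTime() - startTime)

With this version I would expect duration to print right away and loop2 to print before loop1, which is what I assumed the plain Promise version above would already do.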