Description
Consider the following example:
```swift
import ServiceLifecycle

var count = 0
let maxCount = 1_000_000

print("Counting up to \(maxCount) elements...")
while count < maxCount {
    // Create an arbitrary stream
    let stream = AsyncStream(unfolding: {
        return Int.random(in: 0..<Int.max)
    })
    // Each call to cancelOnGracefulShutdown() causes a leak
    for await _ in stream.cancelOnGracefulShutdown() {
        count += 1
        break
    }
}
print("Done!")
```
Here, a new stream is created and destroyed on each iteration of the loop: we consume a single element from the stream (just a random number), exit the `for await` loop, and repeat.
Running this code shows that each call to cancelOnGracefulShutdown()
causes a memory leak. Here is what I see when I run this example under heaptrack:
Looking at the top-down allocations, it looks like something in the AsyncAlgorithms
module is getting allocated and not released:
If I move the creation of the stream out of the while loop and do this:
```swift
// Create a single stream outside the loop
let stream = AsyncStream(unfolding: {
    return Int.random(in: 0..<Int.max)
})

while count < maxCount {
    // Each call to cancelOnGracefulShutdown() causes a leak
    for await _ in stream.cancelOnGracefulShutdown() {
        count += 1
        break
    }
}
```
I still see exactly the same behavior in heaptrack. It could be that the leak is in AsyncAlgorithms
itself, but from the heaptrack output I cannot tell what exactly is causing it.
Maybe this is "unintended usage" of cancelOnGracefulShutdown()
, but it seems like this could become a problem if an application has to create and consume streams constantly while still supporting graceful shutdown. It is also a concern for my own apps, because I use cancelOnGracefulShutdown()
everywhere, and having this memory leak there is concerning.
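For what it's worth, a possible workaround I'm considering is to stop wrapping every short-lived stream in cancelOnGracefulShutdown() and instead consume the plain streams inside a single graceful-shutdown scope, cancelling the consuming task once on shutdown. This is only a sketch under the assumption that `withGracefulShutdownHandler` from ServiceLifecycle fits here (the `consume` function is hypothetical), and I have not yet verified that it avoids the leak:

```swift
import ServiceLifecycle

// Hypothetical workaround sketch: one graceful-shutdown handler for the
// whole loop instead of one cancelOnGracefulShutdown() per iteration.
func consume(upTo maxCount: Int) async {
    // The inner task consumes plain AsyncStreams and checks for
    // cancellation between iterations.
    let task = Task {
        var count = 0
        while count < maxCount, !Task.isCancelled {
            let stream = AsyncStream(unfolding: {
                return Int.random(in: 0..<Int.max)
            })
            for await _ in stream {
                count += 1
                break
            }
        }
    }
    // Register a single shutdown handler that cancels the consuming task.
    await withGracefulShutdownHandler {
        await task.value
    } onGracefulShutdown: {
        task.cancel()
    }
}
```

This trades the per-iteration sequence wrapper for a single registration, at the cost of only observing shutdown between stream elements rather than mid-await.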