Instagram has introduced Immortal Objects – PEP-683 – to Python. Now, objects can bypass reference count checks and live throughout the entire execution of the runtime, unlocking exciting avenues f…
I have not read the PEP itself or the PEPs that they claim to simplify, but this feels like a very bad idea that only really benefits Meta and a few other mega-scale servers. It enables a micro-optimization that is only usable in a niche use case (forking long-running processes with shared memory). Unfortunately, it makes all other Python applications, on average, 2% slower. This is a minor regression, but it hurts everyone and benefits almost no one.
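To make the mechanism concrete: here is a minimal sketch of what "bypassing reference count checks" looks like from Python, assuming CPython 3.12+ (where PEP 683 landed); the exact numbers printed are implementation details and will differ on older versions, where the second pair of counts will not match.

    import sys

    # Ordinary object: every new reference bumps its stored refcount.
    # In a forked child, that write is what dirties copy-on-write pages.
    obj = object()
    before = sys.getrefcount(obj)
    refs = [obj] * 100
    print("regular object:", before, "->", sys.getrefcount(obj))

    # On CPython 3.12+ (PEP 683), singletons such as None, True, and small
    # ints are immortal: incref leaves their stored count at a fixed
    # sentinel, so referencing them after fork() no longer writes to the
    # memory page they live on. On older versions these two numbers differ.
    before = sys.getrefcount(None)
    refs_none = [None] * 100
    print("None:", before, "->", sys.getrefcount(None))

That per-incref check on every refcount operation is also where the reported ~2% average slowdown comes from.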
Shouldn’t this be useful for pandas and other data analytics libraries? That would be useful beyond Meta and similar companies. A lot of mid-to-large-sized orgs use those.