
What is the computational complexity of Python 3's built-in data structures (list, dict, tuple, etc.)?

(Memory complexity issues are interesting as well.)

What I've found so far: http://wiki.python.org/moin/TimeComplexity. I'm afraid it covers Python 2, doesn't it?

2 Answers


The wiki entry applies to CPython for both Python 2 and 3; those classes were not changed. Other implementations could differ, but they have to be similar if they want people to port code from CPython to them. A Python that used, for instance, linked lists for the list type would require a considerably different programming style and code rearrangement to work well ;-).
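The dynamic-array costs the wiki table lists can be illustrated directly; a minimal sketch (the deque comparison is my own addition, CPython assumed):

```python
from collections import deque

# CPython's list is a dynamic array, as the wiki table assumes:
# append/pop at the end are amortized O(1), insert/pop at the front O(n).
lst = list(range(5))
lst.append(5)      # amortized O(1): may occasionally resize the array
lst.insert(0, -1)  # O(n): every existing element shifts right

# collections.deque is built for O(1) appends and pops at both ends.
d = deque(range(5))
d.appendleft(-1)   # O(1)

print(lst)
print(list(d))
```

A linked-list-based list type, as imagined above, would invert these costs: cheap front insertion but O(n) indexing.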

In 3.3, the dict class was changed so that the key table common to the __dict__s of multiple instances of one class is shared instead of duplicated (PEP 412). This mainly affects people with 100s or 1000s of instances of some class and saves about 1/3 of the dict space (I think), a constant factor that does not change the space complexity class.
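That key sharing can be observed with sys.getsizeof; a minimal sketch (the Point class and the plain-dict comparison are illustrative assumptions of mine, CPython 3.3+ assumed):

```python
import sys

class Point:
    """Illustrative class; in CPython 3.3+, its instances share one key table."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

p1, p2 = Point(1, 2), Point(3, 4)

# A plain dict with the same items carries its own private key table.
plain = {'x': 1, 'y': 2}

# The instance __dict__ reports a smaller footprint, because the shared
# key table is not counted against any single instance.
print(sys.getsizeof(p1.__dict__))
print(sys.getsizeof(plain))
```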

More importantly, the (unicode) str class has been completely rewritten to use only as many bytes per character as are required (PEP 393). The result is generally less space and less time, but again the main effect is on the constant multipliers that O(...) notation ignores.
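The per-character width chosen by the new str representation can also be seen with sys.getsizeof; a minimal sketch (the sample strings are my own, CPython 3.3+ assumed):

```python
import sys

# PEP 393 (CPython 3.3+): a str stores 1, 2, or 4 bytes per character,
# chosen by the widest character the string contains.
latin = 'a' * 100            # 1 byte per character
greek = '\u0394' * 100       # 2 bytes per character (GREEK CAPITAL DELTA)
emoji = '\U0001F600' * 100   # 4 bytes per character (outside the BMP)

# Same length, very different memory footprints.
for s in (latin, greek, emoji):
    print(len(s), sys.getsizeof(s))
```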




The computational complexity depends on the algorithm used to solve the problem. Regardless of whether Python 3 is faster or not, the complexity classes will be the same, unless things are solved using algorithms that differ in principle.

Sometimes, abstract data structures with the same name may differ (say, Python 2 strings vs. Python 3 strings, or int and long in Python 2 vs. the unified int in Python 3).

I did not check it, but my guess is that Python 2 and Python 3 do not differ in this sense.

4 Comments

That's what I am asking: "Are there any differences in the algorithms used, in principle?" ;) Since I am asking about computational complexity, I assumed it was clear I meant the underlying algorithms. By design, more methods in Python 3 return generators and iterators instead of lists or dictionaries, which reduces memory complexity. I am curious about examples of other improvements.
Well, you could use iterators even in Python 2; the only difference is that you had to be explicit. Then the question is whether you want to compare different algorithms. Iterators may make some algorithms less memory-consuming, but the time complexity must be basically the same.
I wanted to know about any complexity differences; maybe you are right that I formulated the question badly.
I do not think it is a bad question. Code of that level of importance is usually written the best possible way. The widespread introduction of iterators as the default is rather a sign of learning from the past, and it fits the way we really want to think about problems. The consumed time can change this way in algorithms that finish prematurely without consuming the whole sequence. Anyway, the worst case implies the same time complexity. The time complexity of an algorithm is a good hint about its quality, but it should not be the main goal.
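The memory point about iterators in the comments above can be made concrete with sys.getsizeof; a minimal sketch (the squares example and the count of 100,000 are arbitrary choices of mine):

```python
import sys

# A list comprehension materializes every item up front: O(n) memory.
squares_list = [i * i for i in range(100_000)]

# A generator expression holds only its frame and yields lazily: O(1) memory.
squares_gen = (i * i for i in range(100_000))

print(sys.getsizeof(squares_list))  # grows with n
print(sys.getsizeof(squares_gen))   # small, independent of n

# Both produce the same values; only the memory profile differs, and the
# time to consume all items remains O(n) either way.
```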
