While reading data from an ASCII file, I find myself doing something like this:
(a, b, c1, c2, c3, d, e, f1, f2) = (float(x) for x in line.strip().split())
c = (c1, c2, c3)
f = (f1, f2)
If I have a determinate number of elements per line (which I do)¹ and only one multi-element entry to unpack, I can use something like `(a, b, *c, d, e) = ...` (extended iterable unpacking).
Even if I don't, I can of course replace one of the two multi-element entries from the example above with a starred component: (a, b, *c, d, e, f1, f2) = ...
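For illustration, here is how the starred form behaves on a nine-value line (a minimal sketch; the variable names match the question):

```python
line = "1 2 3 4 5 6 7 8 9"
# The starred name collects everything between its neighbours.
a, b, *c, d, e, f1, f2 = (float(x) for x in line.split())
print(c)        # [3.0, 4.0, 5.0]
print(f1, f2)   # 8.0 9.0
```

Note that a starred target always yields a list, not a tuple.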
As far as I can tell, the itertools functions are not of immediate use here.
Are there any alternatives to the three-line code above that may be considered "more pythonic" for a reason I'm probably not aware of?
¹It's determinate but still varies per line; the pattern is too complicated for numpy's loadtxt or genfromtxt functions.
Upvotes: 0
If you use such statements often and want maximum flexibility and reusability instead of writing such patterns over and over, I'd propose creating a small function for it. Just put it into some module and import it (you can even import the script I created).
For usage examples, see the if __name__ == "__main__" block. The trick is to use a list of group ids to group values of t together. The length of this id list should be at least the same as the length of t.
I will only explain the main concepts; if anything is unclear, just ask.
I use groupby from itertools. Even though it might not be obvious how to apply it here, I hope it will become understandable soon.
As the key function I use a method that I dynamically create via a factory function. The main concept here is "closures": the list of group ids is "attached" to the internal function get_group. Thus:
The list is specific to each call to extract_groups_from_iterable. You can use it multiple times; no globals are used.
The state of this list is shared between subsequent calls to the same instance of get_group (remember: functions are objects, too! So I have two instances of get_group during the execution of my script).
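The shared-state behaviour described here can be shown in isolation (a minimal sketch with made-up names, separate from the actual code below):

```python
def make_popper(group_ids):
    # 'group_ids' is captured by the inner function (a closure);
    # its state persists across calls to the same instance.
    def get_next(value):
        return group_ids.pop(0)
    return get_next

f = make_popper([1, 2, 2])
print(f("x"))  # 1
print(f("y"))  # 2  -- the same list is shared between calls
g = make_popper([9])
print(g("z"))  # 9  -- a fresh instance gets its own list
```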
Besides this, I have a simple method to create either lists or scalars from the groups returned by groupby.
That's it.
from itertools import groupby

def extract_groups_from_iterable(iterable, group_ids):
    return [_make_list_or_scalar(g)
            for k, g in groupby(iterable, _get_group_id_provider(group_ids))]

def _get_group_id_provider(group_ids):
    def get_group(value, group_ids=group_ids):
        return group_ids.pop(0)
    return get_group

def _make_list_or_scalar(iterable):
    list_ = list(iterable)
    return list_ if len(list_) != 1 else list_[0]

if __name__ == "__main__":
    t1 = range(9)
    group_ids1 = [1, 2, 3, 4, 5, 5, 6, 7, 8]
    a, b, c, d, e, f, g, h = extract_groups_from_iterable(t1, group_ids1)
    for varname in "abcdefgh":
        print(varname, globals()[varname])
    print()

    t2 = range(15)
    group_ids2 = [1, 2, 2, 3, 4, 5, 5, 5, 5, 5, 6, 6, 6, 7, 8]
    a, b, c, d, e, f, g, h = extract_groups_from_iterable(t2, group_ids2)
    for varname in "abcdefgh":
        print(varname, globals()[varname])
Output is:
a 0
b 1
c 2
d 3
e [4, 5]
f 6
g 7
h 8
a 0
b [1, 2]
c 3
d 4
e [5, 6, 7, 8, 9]
f [10, 11, 12]
g 13
h 14
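Applied to the question's line format, the same grouping idea can also be used inline (my own sketch; the group ids here are made up to mimic the question's c and f groups):

```python
from itertools import groupby

line = "1 2 3 4 5 6 7 8 9"
ids = iter([1, 2, 3, 3, 3, 4, 5, 6, 6])  # hypothetical grouping pattern
# groupby calls the key function once per value, so each value
# consumes one id; equal consecutive ids end up in the same group.
groups = [list(g) for _, g in
          groupby((float(x) for x in line.split()), key=lambda v: next(ids))]
print(groups)  # [[1.0], [2.0], [3.0, 4.0, 5.0], [6.0], [7.0], [8.0, 9.0]]
```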
Once again, this might seem like overkill, but if it helps you reduce your code, use it.
Upvotes: 1
Why not just slice a tuple?
t = tuple(float(x) for x in line.split())
c = t[2:5] #maybe t[2:-4] instead?
f = t[-2:]
demo:
>>> line = "1 2 3 4 5 6 7 8 9"
>>> t = tuple(float(x) for x in line.split())
>>> c = t[2:5] #maybe t[2:-4] instead?
>>> f = t[-2:]
>>> c
(3.0, 4.0, 5.0)
>>> t
(1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0)
>>> c = t[2:-4]
>>> c
(3.0, 4.0, 5.0)
While we're on the topic of being pythonic: line.strip().split() can always be safely written as line.split() where line is a string. split will strip the whitespace for you when you don't give it any arguments.
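A quick check of that claim:

```python
line = "  1 2\t 3 \n"
print(line.split())          # ['1', '2', '3']
print(line.strip().split())  # ['1', '2', '3']  -- identical
```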
Upvotes: 0