Yes, it’s irritating enough that I’m going to blog about it.
Most programming languages have some version of slice() implemented in their design. The principle for the slice method/function/whatever is self-explanatory: it makes a slice of a larger set of data.
HOW it’s implemented varies. It could change the original variable or set of data that you are working on. It could create a NEW variable or set of data that copies that subset of the data. Conceivably, it could do both, like cutting a cake: literally take the data out of the original set, leaving it smaller, and make a new object/variable/whatever with the subset you defined.
Every implementation I’ve seen defines the slice it makes using 1 or 2 numbers. The first number is an indication of where to start the slice, and if the second number is there, it indicates where the slice ends (otherwise, the slice runs to the end of the data).
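As one concrete implementation, JavaScript’s Array.prototype.slice is the copy-making kind — it never touches the original — and it takes one or two numbers, like so (a quick sketch with made-up data):

```javascript
const letters = ["a", "b", "c", "d", "e"];

const tail = letters.slice(2);    // one number: slice from there to the end
const mid = letters.slice(1, 3);  // two numbers: where to start, where to stop

console.log(tail);    // ["c", "d", "e"]
console.log(mid);     // ["b", "c"]
console.log(letters); // still ["a", "b", "c", "d", "e"] — the original survives
```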
Now, before we go on, I have to explain how programming languages and computers number things. If you understand 0-indexing used by most languages, skip a bit to get to my gripe. Otherwise, keep going.
In most computer/programming/scripting languages, counting starts at 0. So let’s say you have an array (an array is a group of things; the specifics of that definition vary from language to language). This array is a list of eight fruits: Banana first, then Apple, Orange, Mango, and Grape, a couple of others after that, and Lemon at the end.
That list is longer than we actually need, but it works. If the array has each of those stored in that order, “Banana” has an index of 0. “Apple” is 1, and so forth until we get to “Lemon” which has an index of 7. The LENGTH of the list is 8 objects. And programming languages will tell you it’s 8 objects long. But they’re going to index it 0-7. Banana is the first object, with an index of 0. Lemon is the eighth object, with an index of 7.
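Here’s that idea as a quick JavaScript sketch (the two fruits at indices 5 and 6 are filler I picked — any fruits would do):

```javascript
// Eight fruits; Kiwi and Peach are my own filler for the middle slots.
const fruits = ["Banana", "Apple", "Orange", "Mango", "Grape", "Kiwi", "Peach", "Lemon"];

console.log(fruits.length); // 8 — the length counts the objects
console.log(fruits[0]);     // "Banana" — the first object, index 0
console.log(fruits[7]);     // "Lemon" — the eighth object, index 7
```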
It’s not necessarily obvious, but there are good reasons for it which I won’t go into here.
Now, back to slice(). The first number, telling you where to start, is typically either a) the index number of the object (a range of 0-7 in our example) or b) the number of the sequence in which it appears (a range of 1-8). You have to know which one a given language uses or you’ll get unexpected results, but once you know, you just memorize it.
The second number, if it appears, can also be either the index number or the sequence number. But there’s also a third option! That “end number” might actually be a measure of how long the slice is.
So if you want to get Orange, Mango, and Grape, there are four common-sense ways of expressing it.
- Index method: fruits.slice(2,4) (because Orange is index 2, and Grape is index 4)
- Sequence method: fruits.slice(3,5) (because Orange is third in the list and Grape is fifth)
- Index + length method: fruits.slice(2,3) (because Orange is index 2, and you want 3 items from the list)
- Sequence + length method: fruits.slice(3,3) (because Orange is the third item in the list, and you want 3 items from the list)
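To make the four conventions concrete, here’s each one written as a tiny JavaScript helper (these helpers are my own illustrations, not real APIs, and the fruits at indices 5 and 6 are filler I picked):

```javascript
const fruits = ["Banana", "Apple", "Orange", "Mango", "Grape", "Kiwi", "Peach", "Lemon"];

const byIndex = (arr, i, j) => arr.slice(i, j + 1);        // inclusive start/end indices
const bySequence = (arr, m, n) => arr.slice(m - 1, n);     // 1-based, inclusive positions
const byIndexLen = (arr, i, len) => arr.slice(i, i + len); // start index + item count
const bySeqLen = (arr, m, len) => arr.slice(m - 1, m - 1 + len); // position + item count

// All four spell out the same slice:
console.log(byIndex(fruits, 2, 4));    // ["Orange", "Mango", "Grape"]
console.log(bySequence(fruits, 3, 5)); // ["Orange", "Mango", "Grape"]
console.log(byIndexLen(fruits, 2, 3)); // ["Orange", "Mango", "Grape"]
console.log(bySeqLen(fruits, 3, 3));   // ["Orange", "Mango", "Grape"]
```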
Personally, I would prefer it if they’d all just use the Index method, but I don’t get to decide these things.
But what REALLY IRRITATES me is what jQuery does with slice. It uses the index for the start item and the sequence number for the end item. So if we wanted Orange, Mango, and Grape, our expression would be fruits.slice(2,5). It uses two completely distinct numbering systems instead of just one, which makes it look like there are either 4 or 5 items in the slice, when there are only 3.
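You can see it in action (again, the two fruits at indices 5 and 6 are my own filler):

```javascript
const fruits = ["Banana", "Apple", "Orange", "Mango", "Grape", "Kiwi", "Peach", "Lemon"];

// 2 is Orange's index; 5 is Grape's place in the sequence (its index plus one).
// Plain JavaScript arrays and jQuery's .slice() both behave this way.
console.log(fruits.slice(2, 5)); // ["Orange", "Mango", "Grape"] — 3 items
```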
It’s not consistent, and that’s stupid. Thanks jQuery. I hate you now.
(No I don’t. Come back. Why you gotta make me hurt you, baby?)