
The other day I was working on some sample code to test out an idea that involved an object with an internal nested array. This is a pretty common pattern in PHP: you have some simple one-off internal data structure, so you make an informal struct using PHP associative arrays. Maybe you document it in a docblock, or maybe you're a lazy jerk and you don't. (Fight me!) But really, who bothers with defining a class for something that simple?

But that got me wondering: is that common pattern really, you know, good? Are objects actually more expensive or harder to work with than arrays? Or, more to the point, is that true today on PHP 7, given all the optimizations that have happened over the years, compared with the bad old days of PHP 4?
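As a purely hypothetical illustration of the pattern (these field names are invented, not taken from the original code), the informal array-struct and the class it stands in for look like this:

```php
<?php

// Informal "struct": an associative array whose shape is, at best,
// described in a docblock.
/** @var array{id: int, label: string} $item */
$item = [
    'id'    => 42,
    'label' => 'example',
];

// The alternative people rarely bother with: a tiny dedicated class.
class Item
{
    /** @var int */
    public $id;
    /** @var string */
    public $label;
}

$obj = new Item();
$obj->id = 42;
$obj->label = 'example';
```

Both carry the same data; the question is what each one costs.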

So like any good scientist I decided to test it. What I found will shock you!

Benchmark environment

My test system is a Lenovo X1 Carbon 2017 Edition, i5-7300U CPU @ 2.60GHz, 16 GB of RAM, running Kubuntu 18.04. I have as much background processing turned off as I could manage. (Always do that before running benchmarks!) Even so, on modern systems runtime optimizations mean there will always be some variation and jitter, so you will almost certainly get different absolute numbers than I do, but the relative values should be about the same.

For the baseline test, we build an array of 1 million items, where each item is an associative array containing an int (from random_int(1, 500)) and a short string, then print the runtime and the peak memory used.
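A minimal sketch of such a baseline script follows. Only random_int(1, 500) and the final printf() format survive from the original; the field names and the short-string generator are my assumptions.

```php
<?php

$start = microtime(true);

// Build 1 million "anonymous structs": an int plus a short string each.
$list = [];
for ($i = 0; $i < 1000000; ++$i) {
    $list[] = [
        'a' => random_int(1, 500),
        'b' => base64_encode(random_bytes(random_int(4, 12))),
    ];
}

// Sort once by key (a no-op for a sequentially built list)...
ksort($list);

// ...and once by the values, with a custom comparison function.
usort($list, function (array $first, array $second): int {
    return [$first['a'], $first['b']] <=> [$second['a'], $second['b']];
});

$stop = microtime(true);
$memory = memory_get_peak_usage();

printf("Runtime: %s\nMemory: %s\n", $stop - $start, $memory);
```

Run it from the CLI so web-server overhead doesn't pollute the numbers.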

This "anonymous struct" is very typical of the kind of data structure I'm talking about: it's usually assigned to a private property within an object and only accessed from inside it. (Although some systems like to expose these anonymous structs as though they were an API, which is one of the most developer-hostile API designs I have ever seen. You know who you are.) 1 million items is somewhat larger than a typical use case, but we want to stress test it, so go big or go home.

The goal is to measure the memory used by all of those nested arrays, as well as the time it takes to process them. For that, we're sorting the array twice: once by the key (which should be a no-op) and once by the array itself, using a custom sort function.

As a second test, I also want to check the serialization size. These giant lookup tables are often built once and serialized to a database for cache lookups, so knowing the trade-off there is also useful. For that we use a slightly different script, which serializes the finished array and reports the strlen() of the result alongside the runtime and memory.
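A sketch of that second script, under the same assumptions as the baseline (only random_int(1, 500) and the three-field printf() format are from the original):

```php
<?php

$start = microtime(true);

// Same construction as the baseline test.
$list = [];
for ($i = 0; $i < 1000000; ++$i) {
    $list[] = [
        'a' => random_int(1, 500),
        'b' => base64_encode(random_bytes(random_int(4, 12))),
    ];
}

// Serialize the whole structure, as you would before caching it.
$ser = serialize($list);

$stop = microtime(true);
$memory = memory_get_peak_usage();

printf("Runtime: %s\nMemory: %s\nSize: %s\n", $stop - $start, $memory, strlen($ser));
```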

To account for natural jitter in the process, I ran each test once to prime it (although on the CLI that shouldn't matter, it doesn't hurt), then ran it three more times in a row and averaged the results. Here's the results for our baseline test:

[Table: Associative array (Sorting) — per-run runtime and memory]

The runtime is pretty stable and the memory usage is constant, as expected. (The slight variation is most likely due to randomly generated numbers of different length.) So: about 9.4 seconds and half a GB of memory to work with associative arrays. Those are the values to beat.

stdClass

For completeness, let's switch to a stdClass object.
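The stdClass variant of the baseline can be sketched as follows, under the same assumptions about field names and string generation:

```php
<?php

$start = microtime(true);

// Same data as before, but each item is a stdClass object rather
// than an associative array.
$list = [];
for ($i = 0; $i < 1000000; ++$i) {
    $item = new stdClass();
    $item->a = random_int(1, 500);
    $item->b = base64_encode(random_bytes(random_int(4, 12)));
    $list[] = $item;
}

// The same two sorts as the baseline.
ksort($list);
usort($list, function (stdClass $first, stdClass $second): int {
    return [$first->a, $first->b] <=> [$second->a, $second->b];
});

$stop = microtime(true);
$memory = memory_get_peak_usage();

printf("Runtime: %s\nMemory: %s\n", $stop - $start, $memory);
```

Note that only the item construction and the comparison callback change; the measurement harness stays identical, which is what makes the runs comparable.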
