NYTProf Performance Profile   « line view »
For starman worker -M FindBin --max-requests 50 --workers 2 --user=kohadev-koha --group kohadev-koha --pid /var/run/koha/kohadev/plack.pid --daemonize --access-log /var/log/koha/kohadev/plack.log --error-log /var/log/koha/kohadev/plack-error.log -E deployment --socket /var/run/koha/kohadev/plack.sock /etc/koha/sites/kohadev/plack.psgi
  Run on Fri Jan 8 14:31:06 2016
Reported on Fri Jan 8 14:31:38 2016

Filename: /usr/share/perl5/DBIx/Class/Storage/DBIHacks.pm
Statements: Executed 11344 statements in 35.1ms
Subroutines

Calls | P | F | Exclusive Time | Inclusive Time | Subroutine
87 | 1 | 1 | 19.2ms | 24.0ms | DBIx::Class::Storage::DBIHacks::_resolve_column_info
87 | 1 | 1 | 3.20ms | 4.73ms | DBIx::Class::Storage::DBIHacks::_collapse_cond
1305 | 4 | 1 | 2.13ms | 2.13ms | DBIx::Class::Storage::DBIHacks::CORE:match (opcode)
87 | 1 | 1 | 1.32ms | 6.06ms | DBIx::Class::Storage::DBIHacks::_extract_fixed_condition_columns
87 | 1 | 1 | 1.09ms | 1.28ms | DBIx::Class::Storage::DBIHacks::_collapse_cond_unroll_pairs
87 | 1 | 1 | 986µs | 1.05ms | DBIx::Class::Storage::DBIHacks::_resolve_ident_sources
174 | 2 | 1 | 104µs | 104µs | DBIx::Class::Storage::DBIHacks::CORE:sort (opcode)
1 | 1 | 1 | 54µs | 391µs | DBIx::Class::Storage::DBIHacks::__ANON__[:869]
1 | 1 | 1 | 35µs | 481µs | DBIx::Class::Storage::DBIHacks::_extract_order_criteria
1 | 1 | 1 | 33µs | 44µs | DBIx::Class::Storage::DBIHacks::BEGIN@16
1 | 1 | 1 | 15µs | 21µs | DBIx::Class::Storage::DBIHacks::BEGIN@10
1 | 1 | 1 | 12µs | 17µs | DBIx::Class::Storage::DBIHacks::BEGIN@14
1 | 1 | 1 | 11µs | 25.8ms | DBIx::Class::Storage::DBIHacks::BEGIN@13
1 | 1 | 1 | 11µs | 30µs | DBIx::Class::Storage::DBIHacks::BEGIN@19
1 | 1 | 1 | 10µs | 26µs | DBIx::Class::Storage::DBIHacks::BEGIN@17
1 | 1 | 1 | 8µs | 26µs | DBIx::Class::Storage::DBIHacks::BEGIN@18
1 | 1 | 1 | 8µs | 11µs | DBIx::Class::Storage::DBIHacks::BEGIN@11
1 | 1 | 1 | 7µs | 44µs | DBIx::Class::Storage::DBIHacks::BEGIN@20
1 | 1 | 1 | 7µs | 193µs | DBIx::Class::Storage::DBIHacks::BEGIN@21
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::__ANON__[:316]
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_adjust_select_args_for_complex_prefetch
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_columns_comprise_identifying_set
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_extract_colinfo_of_stable_main_source_order_by_portion
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_find_join_path_to_node
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_group_over_selection
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_inner_join_to_node
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_order_by_is_stable
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_prune_unused_joins
0 | 0 | 0 | 0s | 0s | DBIx::Class::Storage::DBIHacks::_resolve_aliastypes_from_select_args
Line | Statements | Time on line | Calls | Time in subs | Code
1 package #hide from PAUSE
2 DBIx::Class::Storage::DBIHacks;
3
4#
5# This module contains code that should never have seen the light of day,
6# does not belong in the Storage, or is otherwise unfit for public
7# display. The arrival of SQLA2 should immediately obsolete 90% of this
8#
9
10 | 2 | 35µs | 2 | 28µs
# spent 21µs (15+6) within DBIx::Class::Storage::DBIHacks::BEGIN@10 which was called: # once (15µs+6µs) by base::import at line 10
use strict;
# spent 21µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@10 # spent 6µs making 1 call to strict::import
11 | 2 | 36µs | 2 | 15µs
# spent 11µs (8+4) within DBIx::Class::Storage::DBIHacks::BEGIN@11 which was called: # once (8µs+4µs) by base::import at line 11
use warnings;
# spent 11µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@11 # spent 4µs making 1 call to warnings::import
12
13 | 2 | 170µs | 2 | 25.8ms
# spent 25.8ms (11µs+25.8) within DBIx::Class::Storage::DBIHacks::BEGIN@13 which was called: # once (11µs+25.8ms) by base::import at line 13
use base 'DBIx::Class::Storage';
# spent 25.8ms making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@13 # spent 25.8ms making 1 call to base::import, recursion: max depth 1, sum of overlapping time 25.8ms
14 | 2 | 37µs | 2 | 22µs
# spent 17µs (12+5) within DBIx::Class::Storage::DBIHacks::BEGIN@14 which was called: # once (12µs+5µs) by base::import at line 14
use mro 'c3';
# spent 17µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@14 # spent 5µs making 1 call to mro::import
15
16 | 2 | 66µs | 2 | 55µs
# spent 44µs (33+11) within DBIx::Class::Storage::DBIHacks::BEGIN@16 which was called: # once (33µs+11µs) by base::import at line 16
use List::Util 'first';
# spent 44µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@16 # spent 11µs making 1 call to List::Util::import
17 | 2 | 47µs | 2 | 41µs
# spent 26µs (10+16) within DBIx::Class::Storage::DBIHacks::BEGIN@17 which was called: # once (10µs+16µs) by base::import at line 17
use Scalar::Util 'blessed';
# spent 26µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@17 # spent 16µs making 1 call to Exporter::import
18 | 2 | 48µs | 2 | 44µs
# spent 26µs (8+18) within DBIx::Class::Storage::DBIHacks::BEGIN@18 which was called: # once (8µs+18µs) by base::import at line 18
use DBIx::Class::_Util qw(UNRESOLVABLE_CONDITION serialize);
# spent 26µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@18 # spent 18µs making 1 call to Exporter::import
19 | 2 | 47µs | 2 | 50µs
# spent 30µs (11+19) within DBIx::Class::Storage::DBIHacks::BEGIN@19 which was called: # once (11µs+19µs) by base::import at line 19
use SQL::Abstract qw(is_plain_value is_literal_value);
# spent 30µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@19 # spent 19µs making 1 call to Exporter::import
20 | 2 | 24µs | 2 | 82µs
# spent 44µs (7+37) within DBIx::Class::Storage::DBIHacks::BEGIN@20 which was called: # once (7µs+37µs) by base::import at line 20
use DBIx::Class::Carp;
# spent 44µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@20 # spent 37µs making 1 call to DBIx::Class::Carp::import
21 | 2 | 6.52ms | 2 | 378µs
# spent 193µs (7+185) within DBIx::Class::Storage::DBIHacks::BEGIN@21 which was called: # once (7µs+185µs) by base::import at line 21
use namespace::clean;
# spent 193µs making 1 call to DBIx::Class::Storage::DBIHacks::BEGIN@21 # spent 186µs making 1 call to namespace::clean::import
22
23#
24# This code will remove non-selecting/non-restricting joins from
25# {from} specs, aiding the RDBMS query optimizer
26#
27 sub _prune_unused_joins {
28 my ($self, $attrs) = @_;
29
30 # only standard {from} specs are supported, and we could be disabled in general
31 return ($attrs->{from}, {}) unless (
32 ref $attrs->{from} eq 'ARRAY'
33 and
34 @{$attrs->{from}} > 1
35 and
36 ref $attrs->{from}[0] eq 'HASH'
37 and
38 ref $attrs->{from}[1] eq 'ARRAY'
39 and
40 $self->_use_join_optimizer
41 );
42
43 my $orig_aliastypes = $self->_resolve_aliastypes_from_select_args($attrs);
44
45 my $new_aliastypes = { %$orig_aliastypes };
46
47 # we will be recreating this entirely
48 my @reclassify = 'joining';
49
50 # a grouped set will not be affected by amount of rows. Thus any
51 # purely multiplicator classifications can go
52 # (will be reintroduced below if needed by something else)
53 push @reclassify, qw(multiplying premultiplied)
54 if $attrs->{_force_prune_multiplying_joins} or $attrs->{group_by};
55
56 # nuke what will be recalculated
57 delete @{$new_aliastypes}{@reclassify};
58
59 my @newfrom = $attrs->{from}[0]; # FROM head is always present
60
61 # recalculate what we need once the multipliers are potentially gone
62 # ignore premultiplies, since they do not add any value to anything
63 my %need_joins;
64 for ( @{$new_aliastypes}{grep { $_ ne 'premultiplied' } keys %$new_aliastypes }) {
65 # add all requested aliases
66 $need_joins{$_} = 1 for keys %$_;
67
68 # add all their parents (as per joinpath which is an AoH { table => alias })
69 $need_joins{$_} = 1 for map { values %$_ } map { @{$_->{-parents}} } values %$_;
70 }
71
72 for my $j (@{$attrs->{from}}[1..$#{$attrs->{from}}]) {
73 push @newfrom, $j if (
74 (! defined $j->[0]{-alias}) # legacy crap
75 ||
76 $need_joins{$j->[0]{-alias}}
77 );
78 }
79
80 # we have a new set of joiners - for everything we nuked pull the classification
81 # off the original stack
82 for my $ctype (@reclassify) {
83 $new_aliastypes->{$ctype} = { map
84 { $need_joins{$_} ? ( $_ => $orig_aliastypes->{$ctype}{$_} ) : () }
85 keys %{$orig_aliastypes->{$ctype}}
86 }
87 }
88
89 return ( \@newfrom, $new_aliastypes );
90}
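The alias bookkeeping in _prune_unused_joins above (keep every requested alias plus every parent along its join path) can be sketched in isolation. This is a toy illustration with made-up aliases and the same `-parents` AoH shape as above, not a call into the DBIx::Class API:

```perl
use strict;
use warnings;

# Toy input mimicking the $new_aliastypes shape used above:
# { classification => { alias => { -parents => [ { table => alias }, ... ] } } }
my $aliastypes = {
    restricting => { cd     => { -parents => [ { artist => 'artist' } ] } },
    selecting   => { artist => { -parents => [] } },
};

my %need_joins;
for my $class ( values %$aliastypes ) {
    # add all requested aliases
    $need_joins{$_} = 1 for keys %$class;

    # add all their parents (joinpath is an AoH of { table => alias })
    $need_joins{$_} = 1
        for map { values %$_ } map { @{ $_->{-parents} } } values %$class;
}

print join( ',', sort keys %need_joins ), "\n";    # artist,cd
```

Any join whose alias never lands in %need_joins is then dropped from the {from} list, which is exactly the pruning loop over `@{$attrs->{from}}[1..$#{$attrs->{from}}]` above.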
91
92#
93# This is the code producing joined subqueries like:
94# SELECT me.*, other.* FROM ( SELECT me.* FROM ... ) JOIN other ON ...
95#
96 sub _adjust_select_args_for_complex_prefetch {
97 my ($self, $attrs) = @_;
98
99 $self->throw_exception ('Complex prefetches are not supported on resultsets with a custom from attribute') unless (
100 ref $attrs->{from} eq 'ARRAY'
101 and
102 @{$attrs->{from}} > 1
103 and
104 ref $attrs->{from}[0] eq 'HASH'
105 and
106 ref $attrs->{from}[1] eq 'ARRAY'
107 );
108
109 my $root_alias = $attrs->{alias};
110
111 # generate inner/outer attribute lists, remove stuff that doesn't apply
112 my $outer_attrs = { %$attrs };
113 delete @{$outer_attrs}{qw(from bind rows offset group_by _grouped_by_distinct having)};
114
115 my $inner_attrs = { %$attrs, _simple_passthrough_construction => 1 };
116 delete @{$inner_attrs}{qw(for collapse select as)};
117
118 # there is no point of ordering the insides if there is no limit
119 delete $inner_attrs->{order_by} if (
120 delete $inner_attrs->{_order_is_artificial}
121 or
122 ! $inner_attrs->{rows}
123 );
124
125 # generate the inner/outer select lists
126 # for inside we consider only stuff *not* brought in by the prefetch
127 # on the outside we substitute any function for its alias
128 $outer_attrs->{select} = [ @{$attrs->{select}} ];
129
130 my ($root_node, $root_node_offset);
131
132 for my $i (0 .. $#{$inner_attrs->{from}}) {
133 my $node = $inner_attrs->{from}[$i];
134 my $h = (ref $node eq 'HASH') ? $node
135 : (ref $node eq 'ARRAY' and ref $node->[0] eq 'HASH') ? $node->[0]
136 : next
137 ;
138
139 if ( ($h->{-alias}||'') eq $root_alias and $h->{-rsrc} ) {
140 $root_node = $h;
141 $root_node_offset = $i;
142 last;
143 }
144 }
145
146 $self->throw_exception ('Complex prefetches are not supported on resultsets with a custom from attribute')
147 unless $root_node;
148
149 # use the heavy duty resolver to take care of aliased/nonaliased naming
150 my $colinfo = $self->_resolve_column_info($inner_attrs->{from});
151 my $selected_root_columns;
152
153 for my $i (0 .. $#{$outer_attrs->{select}}) {
154 my $sel = $outer_attrs->{select}->[$i];
155
156 next if (
157 $colinfo->{$sel} and $colinfo->{$sel}{-source_alias} ne $root_alias
158 );
159
160 if (ref $sel eq 'HASH' ) {
161 $sel->{-as} ||= $attrs->{as}[$i];
162 $outer_attrs->{select}->[$i] = join ('.', $root_alias, ($sel->{-as} || "inner_column_$i") );
163 }
164 elsif (! ref $sel and my $ci = $colinfo->{$sel}) {
165 $selected_root_columns->{$ci->{-colname}} = 1;
166 }
167
168 push @{$inner_attrs->{select}}, $sel;
169
170 push @{$inner_attrs->{as}}, $attrs->{as}[$i];
171 }
172
173 # We will need to fetch all native columns in the inner subquery, which may
174 # be a part of an *outer* join condition, or an order_by (which needs to be
175 # preserved outside), or wheres. In other words everything but the inner
176 # selector
177 # We can not just fetch everything because a potential has_many restricting
178 # join collapse *will not work* on heavy data types.
179 my $connecting_aliastypes = $self->_resolve_aliastypes_from_select_args({
180 %$inner_attrs,
181 select => [],
182 });
183
184 for (sort map { keys %{$_->{-seen_columns}||{}} } map { values %$_ } values %$connecting_aliastypes) {
185 my $ci = $colinfo->{$_} or next;
186 if (
187 $ci->{-source_alias} eq $root_alias
188 and
189 ! $selected_root_columns->{$ci->{-colname}}++
190 ) {
191 # adding it to both to keep limits not supporting dark selectors happy
192 push @{$inner_attrs->{select}}, $ci->{-fq_colname};
193 push @{$inner_attrs->{as}}, $ci->{-fq_colname};
194 }
195 }
196
197 # construct the inner {from} and lock it in a subquery
198 # we need to prune first, because this will determine if we need a group_by below
199 # throw away all non-selecting, non-restricting multijoins
200 # (since we def. do not care about multiplication of the contents of the subquery)
201 my $inner_subq = do {
202
203 # must use it here regardless of user requests (vastly gentler on optimizer)
204 local $self->{_use_join_optimizer} = 1;
205
206 # throw away multijoins since we def. do not care about those inside the subquery
207 ($inner_attrs->{from}, my $inner_aliastypes) = $self->_prune_unused_joins ({
208 %$inner_attrs, _force_prune_multiplying_joins => 1
209 });
210
211 # uh-oh a multiplier (which is not us) left in, this is a problem for limits
212 # we will need to add a group_by to collapse the resultset for proper counts
213 if (
214 grep { $_ ne $root_alias } keys %{ $inner_aliastypes->{multiplying} || {} }
215 and
216 ( ! $inner_aliastypes->{grouping} or $inner_attrs->{_grouped_by_distinct} )
217
218 ) {
219
220 my $cur_sel = { map { $_ => 1 } @{$inner_attrs->{select}} };
221
222 # *possibly* supplement the main selection with pks if not already
223 # there, as they will have to be a part of the group_by to collapse
224 # things properly
225 my $inner_select_with_extras;
226 my @pks = map { "$root_alias.$_" } $root_node->{-rsrc}->primary_columns
227 or $self->throw_exception( sprintf
228 'Unable to perform complex limited prefetch off %s without declared primary key',
229 $root_node->{-rsrc}->source_name,
230 );
231 for my $col (@pks) {
232 push @{ $inner_select_with_extras ||= [ @{$inner_attrs->{select}} ] }, $col
233 unless $cur_sel->{$col}++;
234 }
235
236 ($inner_attrs->{group_by}, $inner_attrs->{order_by}) = $self->_group_over_selection({
237 %$inner_attrs,
238 $inner_select_with_extras ? ( select => $inner_select_with_extras ) : (),
239 _aliastypes => $inner_aliastypes,
240 });
241 }
242
243 # we already optimized $inner_attrs->{from} above
244 # and already local()ized
245 $self->{_use_join_optimizer} = 0;
246
247 # generate the subquery
248 $self->_select_args_to_query (
249 @{$inner_attrs}{qw(from select where)},
250 $inner_attrs,
251 );
252 };
253
254 # Generate the outer from - this is relatively easy (really just replace
255 # the join slot with the subquery), with a major caveat - we can not
256 # join anything that is non-selecting (not part of the prefetch), but at
257 # the same time is a multi-type relationship, as it will explode the result.
258 #
259 # There are two possibilities here
260 # - either the join is non-restricting, in which case we simply throw it away
261 # - it is part of the restrictions, in which case we need to collapse the outer
262 # result by tacking yet another group_by onto the outside of the query
263
264 # work on a shallow copy
265 my @orig_from = @{$attrs->{from}};
266
267
268 $outer_attrs->{from} = \ my @outer_from;
269
270 # we may not be the head
271 if ($root_node_offset) {
272 # first generate the outer_from, up to the substitution point
273 @outer_from = splice @orig_from, 0, $root_node_offset;
274
275 # substitute the subq at the right spot
276 push @outer_from, [
277 {
278 -alias => $root_alias,
279 -rsrc => $root_node->{-rsrc},
280 $root_alias => $inner_subq,
281 },
282 # preserve attrs from what is now the head of the from after the splice
283 @{$orig_from[0]}[1 .. $#{$orig_from[0]}],
284 ];
285 }
286 else {
287 @outer_from = {
288 -alias => $root_alias,
289 -rsrc => $root_node->{-rsrc},
290 $root_alias => $inner_subq,
291 };
292 }
293
294 shift @orig_from; # what we just replaced above
295
296 # scan the *remaining* from spec against different attributes, and see which joins are needed
297 # in what role
298 my $outer_aliastypes = $outer_attrs->{_aliastypes} =
299 $self->_resolve_aliastypes_from_select_args({ %$outer_attrs, from => \@orig_from });
300
301 # unroll parents
302 my ($outer_select_chain, @outer_nonselecting_chains) = map { +{
303 map { $_ => 1 } map { values %$_} map { @{$_->{-parents}} } values %{ $outer_aliastypes->{$_} || {} }
304 } } qw/selecting restricting grouping ordering/;
305
306 # see what's left - throw away if not selecting/restricting
307 my $may_need_outer_group_by;
308 while (my $j = shift @orig_from) {
309 my $alias = $j->[0]{-alias};
310
311 if (
312 $outer_select_chain->{$alias}
313 ) {
314 push @outer_from, $j
315 }
316 elsif (first { $_->{$alias} } @outer_nonselecting_chains ) {
317 push @outer_from, $j;
318 $may_need_outer_group_by ||= $outer_aliastypes->{multiplying}{$alias} ? 1 : 0;
319 }
320 }
321
322 # also throw in a synthetic group_by if a non-selecting multiplier,
323 # to guard against cross-join explosions
324 # the logic is somewhat fragile, but relies on the idea that if a user supplied
325 # a group by on their own - they know what they were doing
326 if ( $may_need_outer_group_by and $attrs->{_grouped_by_distinct} ) {
327 ($outer_attrs->{group_by}, $outer_attrs->{order_by}) = $self->_group_over_selection ({
328 %$outer_attrs,
329 from => \@outer_from,
330 });
331 }
332
333 # This is totally horrific - the {where} ends up in both the inner and outer query
334 # Unfortunately not much can be done until SQLA2 introspection arrives, and even
335 # then if where conditions apply to the *right* side of the prefetch, you may have
336 # to both filter the inner select (e.g. to apply a limit) and then have to re-filter
337 # the outer select to exclude joins you didn't want in the first place
338 #
339 # OTOH it can be seen as a plus: <ash> (notes that this query would make a DBA cry ;)
340 return $outer_attrs;
341}
342
343#
344# I KNOW THIS SUCKS! GET SQLA2 OUT THE DOOR SO THIS CAN DIE!
345#
346# Due to a lack of SQLA2 we fall back to crude scans of all the
347# select/where/order/group attributes, in order to determine what
348# aliases are needed to fulfill the query. This information is used
349# throughout the code to prune unnecessary JOINs from the queries
350# in an attempt to reduce the execution time.
351# Although the method is pretty horrific, the worst thing that can
352# happen is for it to fail due to some scalar SQL, which in turn will
353# result in a vocal exception.
354 sub _resolve_aliastypes_from_select_args {
355 my ( $self, $attrs ) = @_;
356
357 $self->throw_exception ('Unable to analyze custom {from}')
358 if ref $attrs->{from} ne 'ARRAY';
359
360 # what we will return
361 my $aliases_by_type;
362
363 # see what aliases are there to work with
364 # and record who is a multiplier and who is premultiplied
365 my $alias_list;
366 for my $node (@{$attrs->{from}}) {
367
368 my $j = $node;
369 $j = $j->[0] if ref $j eq 'ARRAY';
370 my $al = $j->{-alias}
371 or next;
372
373 $alias_list->{$al} = $j;
374
375 $aliases_by_type->{multiplying}{$al} ||= { -parents => $j->{-join_path}||[] }
376 # not array == {from} head == can't be multiplying
377 if ref($node) eq 'ARRAY' and ! $j->{-is_single};
378
379 $aliases_by_type->{premultiplied}{$al} ||= { -parents => $j->{-join_path}||[] }
380 # parts of the path that are not us but are multiplying
381 if grep { $aliases_by_type->{multiplying}{$_} }
382 grep { $_ ne $al }
383 map { values %$_ }
384 @{ $j->{-join_path}||[] }
385 }
386
387 # get a column to source/alias map (including unambiguous unqualified ones)
388 my $colinfo = $self->_resolve_column_info ($attrs->{from});
389
390 # set up a botched SQLA
391 my $sql_maker = $self->sql_maker;
392
393 # these are throw away results, do not pollute the bind stack
394 local $sql_maker->{where_bind};
395 local $sql_maker->{group_bind};
396 local $sql_maker->{having_bind};
397 local $sql_maker->{from_bind};
398
399 # we can't scan properly without any quoting (\b doesn't cut it
400 # everywhere), so unless there is proper quoting set - use our
401 # own weird impossible character.
402 # Also in the case of no quoting, we need to explicitly disable
403 # name_sep, otherwise sorry nasty legacy syntax like
404 # { 'count(foo.id)' => { '>' => 3 } } will stop working >:(
405 local $sql_maker->{quote_char} = $sql_maker->{quote_char};
406 local $sql_maker->{name_sep} = $sql_maker->{name_sep};
407
408 unless (defined $sql_maker->{quote_char} and length $sql_maker->{quote_char}) {
409 $sql_maker->{quote_char} = ["\x00", "\xFF"];
410 # if we don't unset it we screw up retarded but unfortunately working
411 # 'MAX(foo.bar)' => { '>', 3 }
412 $sql_maker->{name_sep} = '';
413 }
414
415 my ($lquote, $rquote, $sep) = map { quotemeta $_ } ($sql_maker->_quote_chars, $sql_maker->name_sep);
416
417 # generate sql chunks
418 my $to_scan = {
419 restricting => [
420 ($sql_maker->_recurse_where ($attrs->{where}))[0],
421 $sql_maker->_parse_rs_attrs ({ having => $attrs->{having} }),
422 ],
423 grouping => [
424 $sql_maker->_parse_rs_attrs ({ group_by => $attrs->{group_by} }),
425 ],
426 joining => [
427 $sql_maker->_recurse_from (
428 ref $attrs->{from}[0] eq 'ARRAY' ? $attrs->{from}[0][0] : $attrs->{from}[0],
429 @{$attrs->{from}}[1 .. $#{$attrs->{from}}],
430 ),
431 ],
432 selecting => [
433 map { ($sql_maker->_recurse_fields($_))[0] } @{$attrs->{select}},
434 ],
435 ordering => [
436 map { $_->[0] } $self->_extract_order_criteria ($attrs->{order_by}, $sql_maker),
437 ],
438 };
439
440 # throw away empty chunks and all 2-value arrayrefs: the thinking is that these are
441 # bind value specs left in by the sloppy renderer above. It is ok to do this
442 # at this point, since we are going to end up rewriting this crap anyway
443 for my $v (values %$to_scan) {
444 my @nv;
445 for (@$v) {
446 next if (
447 ! defined $_
448 or
449 (
450 ref $_ eq 'ARRAY'
451 and
452 ( @$_ == 0 or @$_ == 2 )
453 )
454 );
455
456 if (ref $_) {
457 require Data::Dumper::Concise;
458 $self->throw_exception("Unexpected ref in scan-plan: " . Data::Dumper::Concise::Dumper($v) );
459 }
460
461 push @nv, $_;
462 }
463
464 $v = \@nv;
465 }
466
467 # kill all selectors which look like a proper subquery
468 # this is a sucky heuristic *BUT* - if we get it wrong the query will simply
469 # fail to run, so we are relatively safe
470 $to_scan->{selecting} = [ grep {
471 $_ !~ / \A \s* \( \s* SELECT \s+ .+? \s+ FROM \s+ .+? \) \s* \z /xsi
472 } @{ $to_scan->{selecting} || [] } ];
473
474 # first see if we have any exact matches (qualified or unqualified)
475 for my $type (keys %$to_scan) {
476 for my $piece (@{$to_scan->{$type}}) {
477 if ($colinfo->{$piece} and my $alias = $colinfo->{$piece}{-source_alias}) {
478 $aliases_by_type->{$type}{$alias} ||= { -parents => $alias_list->{$alias}{-join_path}||[] };
479 $aliases_by_type->{$type}{$alias}{-seen_columns}{$colinfo->{$piece}{-fq_colname}} = $piece;
480 }
481 }
482 }
483
484 # now loop through all fully qualified columns and get the corresponding
485 # alias (should work even if they are in scalarrefs)
486 for my $alias (keys %$alias_list) {
487 my $al_re = qr/
488 $lquote $alias $rquote $sep (?: $lquote ([^$rquote]+) $rquote )?
489 |
490 \b $alias \. ([^\s\)\($rquote]+)?
491 /x;
492
493 for my $type (keys %$to_scan) {
494 for my $piece (@{$to_scan->{$type}}) {
495 if (my @matches = $piece =~ /$al_re/g) {
496 $aliases_by_type->{$type}{$alias} ||= { -parents => $alias_list->{$alias}{-join_path}||[] };
497 $aliases_by_type->{$type}{$alias}{-seen_columns}{"$alias.$_"} = "$alias.$_"
498 for grep { defined $_ } @matches;
499 }
500 }
501 }
502 }
503
504 # now loop through unqualified column names, and try to locate them within
505 # the chunks
506 for my $col (keys %$colinfo) {
507 next if $col =~ / \. /x; # if column is qualified it was caught by the above
508
509 my $col_re = qr/ $lquote ($col) $rquote /x;
510
511 for my $type (keys %$to_scan) {
512 for my $piece (@{$to_scan->{$type}}) {
513 if ( my @matches = $piece =~ /$col_re/g) {
514 my $alias = $colinfo->{$col}{-source_alias};
515 $aliases_by_type->{$type}{$alias} ||= { -parents => $alias_list->{$alias}{-join_path}||[] };
516 $aliases_by_type->{$type}{$alias}{-seen_columns}{"$alias.$_"} = $_
517 for grep { defined $_ } @matches;
518 }
519 }
520 }
521 }
522
523 # Add any non-left joins to the restriction list (such joins are indeed restrictions)
524 for my $j (values %$alias_list) {
525 my $alias = $j->{-alias} or next;
526 $aliases_by_type->{restricting}{$alias} ||= { -parents => $j->{-join_path}||[] } if (
527 (not $j->{-join_type})
528 or
529 ($j->{-join_type} !~ /^left (?: \s+ outer)? $/xi)
530 );
531 }
532
533 for (keys %$aliases_by_type) {
534 delete $aliases_by_type->{$_} unless keys %{$aliases_by_type->{$_}};
535 }
536
537 return $aliases_by_type;
538}
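The impossible-quote-char trick used above (substituting "\x00"/"\xFF" as quote characters so a qualified column can be spotted anywhere, even inside a function call where \b-matching is unreliable) can be demonstrated standalone. A simplified sketch with a hand-built SQL chunk and an explicit "." separator, not the real SQL::Abstract rendering:

```perl
use strict;
use warnings;

my ( $lq, $rq ) = ( "\x00", "\xFF" );    # impossible quote chars, as above
my $alias = 'cd';

# a rendered chunk where plain \b-based matching would be unreliable
my $sql = "COUNT( ${lq}${alias}${rq}.${lq}title${rq} ) > 3";

my ( $lquote, $rquote ) = map { quotemeta } ( $lq, $rq );

# same shape as the first alternative of $al_re above:
# quoted alias, separator, optionally a quoted column (captured)
my $al_re = qr/ $lquote \Q$alias\E $rquote \. (?: $lquote ([^$rquote]+) $rquote )? /x;

my @seen = $sql =~ /$al_re/g;
print "@seen\n";    # title
```

Because "\x00" and "\xFF" cannot occur in legitimate identifiers, a hit on this pattern is unambiguous, which is what lets the scan classify the `cd` alias as used (here, in a restricting chunk).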
539
540# This is the engine behind { distinct => 1 } and the general
541# complex prefetch grouper
542 sub _group_over_selection {
543 my ($self, $attrs) = @_;
544
545 my $colinfos = $self->_resolve_column_info ($attrs->{from});
546
547 my (@group_by, %group_index);
548
549 # the logic is: if it is a { func => val } we assume an aggregate,
550 # otherwise if \'...' or \[...] we assume the user knows what is
551 # going on thus group over it
552 for (@{$attrs->{select}}) {
553 if (! ref($_) or ref ($_) ne 'HASH' ) {
554 push @group_by, $_;
555 $group_index{$_}++;
556 if ($colinfos->{$_} and $_ !~ /\./ ) {
557 # add a fully qualified version as well
558 $group_index{"$colinfos->{$_}{-source_alias}.$_"}++;
559 }
560 }
561 }
562
563 my @order_by = $self->_extract_order_criteria($attrs->{order_by})
564 or return (\@group_by, $attrs->{order_by});
565
566 # add any order_by parts that are not already present in the group_by
567 # to maintain SQL cross-compatibility and general sanity
568 #
569 # also in case the original selection is *not* unique, or in case part
570 # of the ORDER BY refers to a multiplier - we will need to replace the
571 # skipped order_by elements with their MIN/MAX equivalents as to maintain
572 # the proper overall order without polluting the group criteria (and
573 # possibly changing the outcome entirely)
574
575 my ($leftovers, $sql_maker, @new_order_by, $order_chunks, $aliastypes);
576
577 my $group_already_unique = $self->_columns_comprise_identifying_set($colinfos, \@group_by);
578
579 for my $o_idx (0 .. $#order_by) {
580
581 # if the chunk is already a min/max function - there is nothing left to touch
582 next if $order_by[$o_idx][0] =~ /^ (?: min | max ) \s* \( .+ \) $/ix;
583
584 # only consider real columns (for functions the user got to do an explicit group_by)
585 my $chunk_ci;
586 if (
587 @{$order_by[$o_idx]} != 1
588 or
589 ( ! ( $chunk_ci = $colinfos->{$order_by[$o_idx][0]} ) and $attrs->{_aliastypes} )
590
- -
593 ) {
594 push @$leftovers, $order_by[$o_idx][0];
595 }
596
597 next unless $chunk_ci;
598
599 # no duplication of group criteria
600 next if $group_index{$chunk_ci->{-fq_colname}};
601
602 $aliastypes ||= (
603 $attrs->{_aliastypes}
604 or
605 $self->_resolve_aliastypes_from_select_args({
606 from => $attrs->{from},
607 order_by => $attrs->{order_by},
608 })
609 ) if $group_already_unique;
610
611 # check that we are not ordering by a multiplier (if a check is requested at all)
612 if (
613 $group_already_unique
614 and
615 ! $aliastypes->{multiplying}{$chunk_ci->{-source_alias}}
616 and
617 ! $aliastypes->{premultiplied}{$chunk_ci->{-source_alias}}
618 ) {
619 push @group_by, $chunk_ci->{-fq_colname};
620 $group_index{$chunk_ci->{-fq_colname}}++
621 }
622 else {
623 # We need to order by external columns without adding them to the group
624 # (either a non-unique selection, or a multi-external)
625 #
626 # This doesn't really make sense in SQL, however from DBICs point
627 # of view is rather valid (e.g. order the leftmost objects by whatever
628 # criteria and get the offset/rows many). There is a way around
629 # this however in SQL - we simply take the direction of each piece
630 # of the external order and convert them to MIN(X) for ASC or MAX(X)
631 # for DESC, and group_by the root columns. The end result should be
632 # exactly what we expect
633
634 # FIXME - this code is a joke, will need to be completely rewritten in
635 # the DQ branch. But I need to push a POC here, otherwise the
636 # pesky tests won't pass
637 # wrap any part of the order_by that "responds" to an ordering alias
638 # into a MIN/MAX
639 $sql_maker ||= $self->sql_maker;
640 $order_chunks ||= [
641 map { ref $_ eq 'ARRAY' ? $_ : [ $_ ] } $sql_maker->_order_by_chunks($attrs->{order_by})
642 ];
643
644 my ($chunk, $is_desc) = $sql_maker->_split_order_chunk($order_chunks->[$o_idx][0]);
645
646 $new_order_by[$o_idx] = \[
647 sprintf( '%s( %s )%s',
648 ($is_desc ? 'MAX' : 'MIN'),
649 $chunk,
650 ($is_desc ? ' DESC' : ''),
651 ),
652 @ {$order_chunks->[$o_idx]} [ 1 .. $#{$order_chunks->[$o_idx]} ]
653 ];
654 }
655 }
656
657 $self->throw_exception ( sprintf
658 'Unable to programmatically derive a required group_by from the supplied '
659 . 'order_by criteria. To proceed either add an explicit group_by, or '
660 . 'simplify your order_by to only include plain columns '
661 . '(supplied order_by: %s)',
662 join ', ', map { "'$_'" } @$leftovers,
663 ) if $leftovers;
664
665 # recreate the untouched order parts
666 if (@new_order_by) {
667 $new_order_by[$_] ||= \ $order_chunks->[$_] for ( 0 .. $#$order_chunks );
668 }
669
670 return (
671 \@group_by,
672 (@new_order_by ? \@new_order_by : $attrs->{order_by} ), # same ref as original == unchanged
673 );
674}
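The MIN/MAX rewrite described in the comments above can be shown in miniature. A hypothetical helper (not the real sql_maker API, whose _split_order_chunk does the direction split) mirroring what the sprintf in the else-branch produces:

```perl
use strict;
use warnings;

# Wrap an external ORDER BY chunk that cannot be added to the GROUP BY:
# ASC pieces become MIN(...), DESC pieces become MAX(...) DESC, which
# preserves the overall ordering of the grouped rows.
sub wrap_order_chunk {
    my ($chunk) = @_;
    my $is_desc = $chunk =~ s/ \s+ DESC \s* \z //xi;    # split off the direction
    return sprintf '%s( %s )%s',
        ( $is_desc ? 'MAX' : 'MIN' ),
        $chunk,
        ( $is_desc ? ' DESC' : '' );
}

print wrap_order_chunk('other.name'),      "\n";    # MIN( other.name )
print wrap_order_chunk('other.name DESC'), "\n";    # MAX( other.name ) DESC
```

Grouping on the root columns while ordering by MIN/MAX of the multiplied side is what lets the limited prefetch keep "order the leftmost objects by whatever criteria" semantics without polluting the group criteria.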
675
676
# spent 1.05ms (986µs+60µs) within DBIx::Class::Storage::DBIHacks::_resolve_ident_sources which was called 87 times, avg 12µs/call: # 87 times (986µs+60µs) by DBIx::Class::Storage::DBIHacks::_resolve_column_info at line 718, avg 12µs/call
sub _resolve_ident_sources {
677 | 87 | 29µs |  |  | my ($self, $ident) = @_;
678
679 | 87 | 43µs |  |  | my $alias2source = {};
680
681 # the reason this is so contrived is that $ident may be a {from}
682 # structure, specifying multiple tables to join
683 | 87 | 352µs | 87 | 60µs | if ( blessed $ident && $ident->isa("DBIx::Class::ResultSource") ) {
# spent 60µs making 87 calls to Scalar::Util::blessed, avg 692ns/call
684 # this is compat mode for insert/update/delete which do not deal with aliases
685 $alias2source->{me} = $ident;
686 }
687 elsif (ref $ident eq 'ARRAY') {
688
689 | 87 | 69µs |  |  | for (@$ident) {
690 | 87 | 26µs |  |  | my $tabinfo;
691 | 87 | 63µs |  |  | if (ref $_ eq 'HASH') {
692 $tabinfo = $_;
693 }
694 | 87 | 29µs |  |  | if (ref $_ eq 'ARRAY' and ref $_->[0] eq 'HASH') {
695 $tabinfo = $_->[0];
696 }
697
698 | 87 | 264µs |  |  | $alias2source->{$tabinfo->{-alias}} = $tabinfo->{-rsrc}
699 if ($tabinfo->{-rsrc});
700 }
701 }
702
703 | 87 | 226µs |  |  | return $alias2source;
704 }
705
706# Takes $ident, \@column_names
707#
708# returns { $column_name => \%column_info, ... }
709# also note: this adds -result_source => $rsrc to the column info
710#
711# If no columns_names are supplied returns info about *all* columns
712# for all sources
713
# spent 24.0ms (19.2+4.81) within DBIx::Class::Storage::DBIHacks::_resolve_column_info which was called 87 times, avg 276µs/call: # 87 times (19.2ms+4.81ms) by DBIx::Class::Storage::DBI::__ANON__[/usr/share/perl5/DBIx/Class/Storage/DBI.pm:1698] at line 1684 of DBIx/Class/Storage/DBI.pm, avg 276µs/call
sub _resolve_column_info {
714  87  42µs    my ($self, $ident, $colnames) = @_;
715
716  87  27µs    return {} if $colnames and ! @$colnames;
717
718  87  230µs  87  1.05ms    my $alias2src = $self->_resolve_ident_sources($ident);
# spent 1.05ms making 87 calls to DBIx::Class::Storage::DBIHacks::_resolve_ident_sources, avg 12µs/call
719
720  87  17µs    my (%seen_cols, @auto_colnames);
721
722 # compile a global list of column names, to be able to properly
723 # disambiguate unqualified column names (if at all possible)
724  87  158µs    for my $alias (keys %$alias2src) {
725  87  49µs      my $rsrc = $alias2src->{$alias};
726  87  291µs  87  272µs      for my $colname ($rsrc->columns) {
# spent 272µs making 87 calls to DBIx::Class::ResultSource::columns, avg 3µs/call
727  435  318µs        push @{$seen_cols{$colname}}, $alias;
728  435  334µs        push @auto_colnames, "$alias.$colname" unless $colnames;
729 }
730 }
731
732 $colnames ||= [
733 @auto_colnames,
734  87  345µs      grep { @{$seen_cols{$_}} == 1 } (keys %seen_cols),
735 ];
736
737  87  27µs    my (%return, $colinfos);
738  87  95µs    foreach my $col (@$colnames) {
739  870  12.8ms  870  1.79ms      my ($source_alias, $colname) = $col =~ m/^ (?: ([^\.]+) \. )? (.+) $/x;
# spent 1.79ms making 870 calls to DBIx::Class::Storage::DBIHacks::CORE:match, avg 2µs/call
740
741 # if the column was seen exactly once - we know which rsrc it came from
742 $source_alias ||= $seen_cols{$colname}[0]
743  870  595µs      if ($seen_cols{$colname} and @{$seen_cols{$colname}} == 1);
744
745  870  72µs      next unless $source_alias;
746
747  870  276µs      my $rsrc = $alias2src->{$source_alias}
748 or next;
749
750 $return{$col} = {
751 %{
752  870  3.64ms  87  1.71ms        ( $colinfos->{$source_alias} ||= $rsrc->columns_info )->{$colname}
# spent 1.71ms making 87 calls to DBIx::Class::ResultSource::columns_info, avg 20µs/call
753 ||
754 $self->throw_exception(
755 "No such column '$colname' on source " . $rsrc->source_name
756 );
757 },
758 -result_source => $rsrc,
759 -source_alias => $source_alias,
760 -fq_colname => $col eq $colname ? "$source_alias.$col" : $col,
761 -colname => $colname,
762 };
763
764  870  981µs      $return{"$source_alias.$colname"} = $return{$col} if $col eq $colname;
765 }
766
767  87  525µs    return \%return;
768  }
769
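The hottest statement above (line 739, 870 executions) is the qualified-name split. A standalone demo of that regex, which peels an optional "alias." prefix off a column name:

```perl
my @examples = ('me.artistid', 'title');
my %split;
for my $col (@examples) {
    # optional "alias." prefix, then the bare column name
    my ($source_alias, $colname) = $col =~ m/^ (?: ([^\.]+) \. )? (.+) $/x;
    $split{$col} = [ $source_alias, $colname ];
}
# 'me.artistid' -> alias 'me', column 'artistid'
# 'title'       -> no alias (undef), column 'title'
```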
770# The DBIC relationship chaining implementation is pretty simple - every
771# new related_relationship is pushed onto the {from} stack, and the {select}
772# window simply slides further in. This means that when we count somewhere
773 # in the middle, we have to make sure that everything in the join chain is an
774# actual inner join, otherwise the count will come back with unpredictable
775 # results (a resultset may be generated with _some_ rows regardless of
776 # whether the relation which the $rs currently selects has rows). E.g.
777# $artist_rs->cds->count - normally generates:
778# SELECT COUNT( * ) FROM artist me LEFT JOIN cd cds ON cds.artist = me.artistid
779# which actually returns the number of artists * (number of cds || 1)
780#
781# So what we do here is crawl {from}, determine if the current alias is at
782# the top of the stack, and if not - make sure the chain is inner-joined down
783# to the root.
784#
785sub _inner_join_to_node {
786 my ($self, $from, $alias) = @_;
787
788 my $switch_branch = $self->_find_join_path_to_node($from, $alias);
789
790 return $from unless @{$switch_branch||[]};
791
792 # So it looks like we will have to switch some stuff around.
793 # local() is useless here as we will be leaving the scope
794 # anyway, and deep cloning is just too fucking expensive
795 # So replace the first hashref in the node arrayref manually
796 my @new_from = ($from->[0]);
797 my $sw_idx = { map { (values %$_), 1 } @$switch_branch }; #there's one k/v per join-path
798
799 for my $j (@{$from}[1 .. $#$from]) {
800 my $jalias = $j->[0]{-alias};
801
802 if ($sw_idx->{$jalias}) {
803 my %attrs = %{$j->[0]};
804 delete $attrs{-join_type};
805 push @new_from, [
806 \%attrs,
807 @{$j}[ 1 .. $#$j ],
808 ];
809 }
810 else {
811 push @new_from, $j;
812 }
813 }
814
815 return \@new_from;
816  }
817
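A standalone sketch (with hypothetical node contents) of the downgrade performed in the loop above: copying a join node's attribute hash and deleting `-join_type` is what later makes the SQL producer emit a plain (inner) JOIN instead of a LEFT JOIN:

```perl
my $join_node = [
    { -alias => 'cds', -join_type => 'left' },
    { 'cds.artist' => { -ident => 'me.artistid' } },
];

# shallow-copy the attribute hash so the original node is untouched,
# then strip the join type from the copy
my %attrs = %{ $join_node->[0] };
delete $attrs{-join_type};
my $inner_node = [ \%attrs, @{$join_node}[ 1 .. $#$join_node ] ];
# $inner_node->[0] no longer carries -join_type; the -alias survives
```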
818sub _find_join_path_to_node {
819 my ($self, $from, $target_alias) = @_;
820
821 # subqueries and other oddness are naturally not supported
822 return undef if (
823 ref $from ne 'ARRAY'
824 ||
825 ref $from->[0] ne 'HASH'
826 ||
827 ! defined $from->[0]{-alias}
828 );
829
830 # no path - the head is the alias
831 return [] if $from->[0]{-alias} eq $target_alias;
832
833 for my $i (1 .. $#$from) {
834 return $from->[$i][0]{-join_path} if ( ($from->[$i][0]{-alias}||'') eq $target_alias );
835 }
836
837 # something else went quite wrong
838 return undef;
839  }
840
841
# spent 481µs (35+446) within DBIx::Class::Storage::DBIHacks::_extract_order_criteria which was called: # once (35µs+446µs) by DBIx::Class::ResultSetColumn::new at line 77 of DBIx/Class/ResultSetColumn.pm
sub _extract_order_criteria {
842  1  1µs    my ($self, $order_by, $sql_maker) = @_;
843
844
# spent 391µs (54+336) within DBIx::Class::Storage::DBIHacks::__ANON__[/usr/share/perl5/DBIx/Class/Storage/DBIHacks.pm:869] which was called: # once (54µs+336µs) by DBIx::Class::Storage::DBIHacks::_extract_order_criteria at line 882
my $parser = sub {
845  1  1µs      my ($sql_maker, $order_by, $orig_quote_chars) = @_;
846
847  1  300ns      return scalar $sql_maker->_order_by_chunks ($order_by)
848 unless wantarray;
849
850  1  5µs  1  273µs      my ($lq, $rq, $sep) = map { quotemeta($_) } (
# spent 273µs making 1 call to DBIx::Class::SQLMaker::name_sep
851 ($orig_quote_chars ? @$orig_quote_chars : $sql_maker->_quote_chars),
852 $sql_maker->name_sep
853 );
854
855  1  500ns      my @chunks;
856  1  21µs  1  63µs      for ($sql_maker->_order_by_chunks ($order_by) ) {
# spent 63µs making 1 call to SQL::Abstract::_order_by_chunks
857 my $chunk = ref $_ ? [ @$_ ] : [ $_ ];
858 ($chunk->[0]) = $sql_maker->_split_order_chunk($chunk->[0]);
859
860 # order criteria may have come back pre-quoted (literals and whatnot)
861 # this is fragile, but the best we can currently do
862 $chunk->[0] =~ s/^ $lq (.+?) $rq $sep $lq (.+?) $rq $/"$1.$2"/xe
863 or $chunk->[0] =~ s/^ $lq (.+) $rq $/$1/x;
864
865 push @chunks, $chunk;
866 }
867
868  1  5µs      return @chunks;
869  1  5µs    };
870
871  1  300ns    if ($sql_maker) {
872 return $parser->($sql_maker, $order_by);
873 }
874 else {
875  1  2µs  1  50µs      $sql_maker = $self->sql_maker;
# spent 50µs making 1 call to DBIx::Class::Storage::DBI::mysql::sql_maker
876
877 # pass these in to deal with literals coming from
878 # the user or the deep guts of prefetch
879  1  7µs  1  6µs      my $orig_quote_chars = [$sql_maker->_quote_chars];
# spent 6µs making 1 call to DBIx::Class::SQLMaker::_quote_chars
880
881  1  1µs      local $sql_maker->{quote_char};
882  1  12µs  1  391µs      return $parser->($sql_maker, $order_by, $orig_quote_chars);
883 }
884  }
885
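A standalone demo of the de-quoting substitutions inside the parser above: a chunk that came back pre-quoted as `` `me`.`year` `` is normalized to `me.year`, and a quoted bare name like `` `year` `` to `year`. The quote characters and name separator are assumed here to be the MySQL-style backtick and dot:

```perl
my ($lq, $rq, $sep) = map { quotemeta($_) } ('`', '`', '.');

my @in = ('`me`.`year`', '`year`');
my @out;
for my $chunk (@in) {
    # first try "quoted.quoted", then a single quoted identifier
    $chunk =~ s/^ $lq (.+?) $rq $sep $lq (.+?) $rq $/"$1.$2"/xe
        or $chunk =~ s/^ $lq (.+) $rq $/$1/x;
    push @out, $chunk;
}
# @out is ('me.year', 'year')
```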
886sub _order_by_is_stable {
887 my ($self, $ident, $order_by, $where) = @_;
888
889 my @cols = (
890 ( map { $_->[0] } $self->_extract_order_criteria($order_by) ),
891 ( $where ? keys %{ $self->_extract_fixed_condition_columns($where) } : () ),
892 ) or return 0;
893
894 my $colinfo = $self->_resolve_column_info($ident, \@cols);
895
896 return keys %$colinfo
897 ? $self->_columns_comprise_identifying_set( $colinfo, \@cols )
898 : 0
899 ;
900  }
901
902sub _columns_comprise_identifying_set {
903 my ($self, $colinfo, $columns) = @_;
904
905 my $cols_per_src;
906 $cols_per_src -> {$_->{-source_alias}} -> {$_->{-colname}} = $_
907 for grep { defined $_ } @{$colinfo}{@$columns};
908
909 for (values %$cols_per_src) {
910 my $src = (values %$_)[0]->{-result_source};
911 return 1 if $src->_identifying_column_set($_);
912 }
913
914 return 0;
915  }
916
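A standalone sketch, with hypothetical colinfo entries, of the grouping step above: resolved columns are bucketed per source alias (unresolvable names drop out via the `grep`) before each bucket is tested against that source's identifying column sets:

```perl
my $colinfo = {
    'me.artistid' => { -source_alias => 'me',  -colname => 'artistid' },
    'cds.cdid'    => { -source_alias => 'cds', -colname => 'cdid'     },
};
my @columns = ('me.artistid', 'cds.cdid', 'unresolved.col');

my $cols_per_src;
# hash-slice the requested columns; undef slots (unresolved names) are skipped
$cols_per_src->{ $_->{-source_alias} }{ $_->{-colname} } = $_
    for grep { defined $_ } @{$colinfo}{@columns};
# $cols_per_src = { me => { artistid => {...} }, cds => { cdid => {...} } }
```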
917 # this is almost identical to _order_by_is_stable, except it takes
918# a single rsrc, and will succeed only if the first portion of the order
919# by is stable.
920# returns that portion as a colinfo hashref on success
921sub _extract_colinfo_of_stable_main_source_order_by_portion {
922 my ($self, $attrs) = @_;
923
924 my $nodes = $self->_find_join_path_to_node($attrs->{from}, $attrs->{alias});
925
926 return unless defined $nodes;
927
928 my @ord_cols = map
929 { $_->[0] }
930 ( $self->_extract_order_criteria($attrs->{order_by}) )
931 ;
932 return unless @ord_cols;
933
934 my $valid_aliases = { map { $_ => 1 } (
935 $attrs->{from}[0]{-alias},
936 map { values %$_ } @$nodes,
937 ) };
938
939 my $colinfos = $self->_resolve_column_info($attrs->{from});
940
941 my ($colinfos_to_return, $seen_main_src_cols);
942
943 for my $col (@ord_cols) {
944 # if order criteria is unresolvable - there is nothing we can do
945 my $colinfo = $colinfos->{$col} or last;
946
947 # if we reached the end of the allowed aliases - also nothing we can do
948 last unless $valid_aliases->{$colinfo->{-source_alias}};
949
950 $colinfos_to_return->{$col} = $colinfo;
951
952 $seen_main_src_cols->{$colinfo->{-colname}} = 1
953 if $colinfo->{-source_alias} eq $attrs->{alias};
954 }
955
956 # FIXME the condition may be singling out things on its own, so we
957 # conceivably could come back with "stable-ordered by nothing"
958 # not confident enough in the parser yet, so punt for the time being
959 return unless $seen_main_src_cols;
960
961 my $main_src_fixed_cols_from_cond = [ $attrs->{where}
962 ? (
963 map
964 {
965 ( $colinfos->{$_} and $colinfos->{$_}{-source_alias} eq $attrs->{alias} )
966 ? $colinfos->{$_}{-colname}
967 : ()
968 }
969 keys %{ $self->_extract_fixed_condition_columns($attrs->{where}) }
970 )
971 : ()
972 ];
973
974 return $attrs->{result_source}->_identifying_column_set([
975 keys %$seen_main_src_cols,
976 @$main_src_fixed_cols_from_cond,
977 ]) ? $colinfos_to_return : ();
978  }
979
980# Attempts to flatten a passed in SQLA condition as much as possible towards
981# a plain hashref, *without* altering its semantics. Required by
982# create/populate being able to extract definitive conditions from preexisting
983# resultset {where} stacks
984#
985# FIXME - while relatively robust, this is still imperfect, one of the first
986# things to tackle with DQ
987
# spent 4.73ms (3.20+1.53) within DBIx::Class::Storage::DBIHacks::_collapse_cond which was called 87 times, avg 54µs/call: # 87 times (3.20ms+1.53ms) by DBIx::Class::Storage::DBIHacks::_extract_fixed_condition_columns at line 1342, avg 54µs/call
sub _collapse_cond {
988  87  38µs    my ($self, $where, $where_is_anded_array) = @_;
989
990  87  26µs    my $fin;
991
992  87  159µs    if (! $where) {
993 return;
994 }
995 elsif ($where_is_anded_array or ref $where eq 'HASH') {
996
997  87  19µs      my @pairs;
998
999  87  69µs      my @pieces = $where_is_anded_array ? @$where : $where;
1000  87  47µs      while (@pieces) {
1001  87  30µs        my $chunk = shift @pieces;
1002
1003  87  128µs        if (ref $chunk eq 'HASH') {
1004  87  476µs  87  84µs          for (sort keys %$chunk) {
# spent 84µs making 87 calls to DBIx::Class::Storage::DBIHacks::CORE:sort, avg 964ns/call
1005
1006 # Match SQLA 1.79 behavior
1007  87  34µs            if ($_ eq '') {
1008 is_literal_value($chunk->{$_})
1009 ? carp 'Hash-pairs consisting of an empty string with a literal are deprecated, use -and => [ $literal ] instead'
1010 : $self->throw_exception("Supplying an empty left hand side argument is not supported in hash-pairs")
1011 ;
1012 }
1013
1014  87  136µs            push @pairs, $_ => $chunk->{$_};
1015 }
1016 }
1017 elsif (ref $chunk eq 'ARRAY') {
1018 push @pairs, -or => $chunk
1019 if @$chunk;
1020 }
1021 elsif ( ! length ref $chunk) {
1022
1023 # Match SQLA 1.79 behavior
1024 $self->throw_exception("Supplying an empty left hand side argument is not supported in array-pairs")
1025 if $where_is_anded_array and (! defined $chunk or $chunk eq '');
1026
1027 push @pairs, $chunk, shift @pieces;
1028 }
1029 else {
1030 push @pairs, '', $chunk;
1031 }
1032 }
1033
1034  87  22µs      return unless @pairs;
1035
1036  87  248µs  87  1.28ms      my @conds = $self->_collapse_cond_unroll_pairs(\@pairs)
# spent 1.28ms making 87 calls to DBIx::Class::Storage::DBIHacks::_collapse_cond_unroll_pairs, avg 15µs/call
1037 or return;
1038
1039 # Consolidate various @conds back into something more compact
1040  87  87µs      for my $c (@conds) {
1041  87  92µs        if (ref $c ne 'HASH') {
1042 push @{$fin->{-and}}, $c;
1043 }
1044 else {
1045  87  286µs  87  20µs          for my $col (sort keys %$c) {
# spent 20µs making 87 calls to DBIx::Class::Storage::DBIHacks::CORE:sort, avg 228ns/call
1046
1047 # consolidate all -and nodes
1048  87  472µs  174  108µs            if ($col =~ /^\-and$/i) {
# spent 108µs making 174 calls to DBIx::Class::Storage::DBIHacks::CORE:match, avg 618ns/call
1049 push @{$fin->{-and}},
1050 ref $c->{$col} eq 'ARRAY' ? @{$c->{$col}}
1051 : ref $c->{$col} eq 'HASH' ? %{$c->{$col}}
1052 : { $col => $c->{$col} }
1053 ;
1054 }
1055 elsif ($col =~ /^\-/) {
1056 push @{$fin->{-and}}, { $col => $c->{$col} };
1057 }
1058 elsif (exists $fin->{$col}) {
1059 $fin->{$col} = [ -and => map {
1060 (ref $_ eq 'ARRAY' and ($_->[0]||'') =~ /^\-and$/i )
1061 ? @{$_}[1..$#$_]
1062 : $_
1063 ;
1064 } ($fin->{$col}, $c->{$col}) ];
1065 }
1066 else {
1067  87  90µs              $fin->{$col} = $c->{$col};
1068 }
1069 }
1070 }
1071 }
1072 }
1073 elsif (ref $where eq 'ARRAY') {
1074 # we are always at top-level here, it is safe to dump empty *standalone* pieces
1075 my $fin_idx;
1076
1077 for (my $i = 0; $i <= $#$where; $i++ ) {
1078
1079 # Match SQLA 1.79 behavior
1080 $self->throw_exception(
1081 "Supplying an empty left hand side argument is not supported in array-pairs"
1082 ) if (! defined $where->[$i] or ! length $where->[$i]);
1083
1084 my $logic_mod = lc ( ($where->[$i] =~ /^(\-(?:and|or))$/i)[0] || '' );
1085
1086 if ($logic_mod) {
1087 $i++;
1088 $self->throw_exception("Unsupported top-level op/arg pair: [ $logic_mod => $where->[$i] ]")
1089 unless ref $where->[$i] eq 'HASH' or ref $where->[$i] eq 'ARRAY';
1090
1091 my $sub_elt = $self->_collapse_cond({ $logic_mod => $where->[$i] })
1092 or next;
1093
1094 my @keys = keys %$sub_elt;
1095 if ( @keys == 1 and $keys[0] !~ /^\-/ ) {
1096 $fin_idx->{ "COL_$keys[0]_" . serialize $sub_elt } = $sub_elt;
1097 }
1098 else {
1099 $fin_idx->{ "SER_" . serialize $sub_elt } = $sub_elt;
1100 }
1101 }
1102 elsif (! length ref $where->[$i] ) {
1103 my $sub_elt = $self->_collapse_cond({ @{$where}[$i, $i+1] })
1104 or next;
1105
1106 $fin_idx->{ "COL_$where->[$i]_" . serialize $sub_elt } = $sub_elt;
1107 $i++;
1108 }
1109 else {
1110 $fin_idx->{ "SER_" . serialize $where->[$i] } = $self->_collapse_cond( $where->[$i] ) || next;
1111 }
1112 }
1113
1114 if (! $fin_idx) {
1115 return;
1116 }
1117 elsif ( keys %$fin_idx == 1 ) {
1118 $fin = (values %$fin_idx)[0];
1119 }
1120 else {
1121 my @or;
1122
1123 # at this point everything is at most one level deep - unroll if needed
1124 for (sort keys %$fin_idx) {
1125 if ( ref $fin_idx->{$_} eq 'HASH' and keys %{$fin_idx->{$_}} == 1 ) {
1126 my ($l, $r) = %{$fin_idx->{$_}};
1127
1128 if (
1129 ref $r eq 'ARRAY'
1130 and
1131 (
1132 ( @$r == 1 and $l =~ /^\-and$/i )
1133 or
1134 $l =~ /^\-or$/i
1135 )
1136 ) {
1137 push @or, @$r
1138 }
1139
1140 elsif (
1141 ref $r eq 'HASH'
1142 and
1143 keys %$r == 1
1144 and
1145 $l =~ /^\-(?:and|or)$/i
1146 ) {
1147 push @or, %$r;
1148 }
1149
1150 else {
1151 push @or, $l, $r;
1152 }
1153 }
1154 else {
1155 push @or, $fin_idx->{$_};
1156 }
1157 }
1158
1159 $fin->{-or} = \@or;
1160 }
1161 }
1162 else {
1163 # not a hash not an array
1164 $fin = { -and => [ $where ] };
1165 }
1166
1167 # unroll single-element -and's
1168  87  147µs    while (
1169 $fin->{-and}
1170 and
1171 @{$fin->{-and}} < 2
1172 ) {
1173 my $and = delete $fin->{-and};
1174 last if @$and == 0;
1175
1176 # at this point we have @$and == 1
1177 if (
1178 ref $and->[0] eq 'HASH'
1179 and
1180 ! grep { exists $fin->{$_} } keys %{$and->[0]}
1181 ) {
1182 $fin = {
1183 %$fin, %{$and->[0]}
1184 };
1185 }
1186 else {
1187 $fin->{-and} = $and;
1188 last;
1189 }
1190 }
1191
1192 # compress same-column conds found in $fin
1193  174  453µs  87  43µs    for my $col ( grep { $_ !~ /^\-/ } keys %$fin ) {
# spent 43µs making 87 calls to DBIx::Class::Storage::DBIHacks::CORE:match, avg 494ns/call
1194  87  80µs      next unless ref $fin->{$col} eq 'ARRAY' and ($fin->{$col}[0]||'') =~ /^\-and$/i;
1195 my $val_bag = { map {
1196 (! defined $_ ) ? ( UNDEF => undef )
1197 : ( ! length ref $_ or is_plain_value $_ ) ? ( "VAL_$_" => $_ )
1198 : ( ( 'SER_' . serialize $_ ) => $_ )
1199 } @{$fin->{$col}}[1 .. $#{$fin->{$col}}] };
1200
1201 if (keys %$val_bag == 1 ) {
1202 ($fin->{$col}) = values %$val_bag;
1203 }
1204 else {
1205 $fin->{$col} = [ -and => map { $val_bag->{$_} } sort keys %$val_bag ];
1206 }
1207 }
1208
1209  87  257µs    return keys %$fin ? $fin : ();
1210  }
1211
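A standalone sketch of the final same-column compression above: values collected under `-and` are keyed into a "value bag" so exact duplicates collapse to one entry, while genuinely different values remain an `-and` list. (The real code additionally handles undefs, plain-value refs, and serialized structures.)

```perl
my %fin = ( artist => [ -and => 1, 1, 2 ] );
my $col = 'artist';

# key every value; duplicate keys overwrite, deduplicating the bag
my $val_bag = { map {
    ( ! defined $_ ) ? ( UNDEF => undef ) : ( "VAL_$_" => $_ )
} @{$fin{$col}}[ 1 .. $#{$fin{$col}} ] };

$fin{$col} = keys %$val_bag == 1
    ? (values %$val_bag)[0]                                 # one distinct value: unwrap
    : [ -and => map { $val_bag->{$_} } sort keys %$val_bag ]; # several: keep -and
# artist => [ -and => 1, 2 ]   (the duplicate 1 is gone)
```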
1212
# spent 1.28ms (1.09+190µs) within DBIx::Class::Storage::DBIHacks::_collapse_cond_unroll_pairs which was called 87 times, avg 15µs/call: # 87 times (1.09ms+190µs) by DBIx::Class::Storage::DBIHacks::_collapse_cond at line 1036, avg 15µs/call
sub _collapse_cond_unroll_pairs {
1213  87  38µs    my ($self, $pairs) = @_;
1214
1215  87  24µs    my @conds;
1216
1217  87  43µs    while (@$pairs) {
1218  87  92µs      my ($lhs, $rhs) = splice @$pairs, 0, 2;
1219
1220  87  633µs  174  190µs      if ($lhs eq '') {
# spent 190µs making 174 calls to DBIx::Class::Storage::DBIHacks::CORE:match, avg 1µs/call
1221 push @conds, $self->_collapse_cond($rhs);
1222 }
1223 elsif ( $lhs =~ /^\-and$/i ) {
1224 push @conds, $self->_collapse_cond($rhs, (ref $rhs eq 'ARRAY'));
1225 }
1226 elsif ( $lhs =~ /^\-or$/i ) {
1227 push @conds, $self->_collapse_cond(
1228 (ref $rhs eq 'HASH') ? [ map { $_ => $rhs->{$_} } sort keys %$rhs ] : $rhs
1229 );
1230 }
1231 else {
1232  87  232µs        if (ref $rhs eq 'HASH' and ! keys %$rhs) {
1233 # FIXME - SQLA seems to be doing... nothing...?
1234 }
1235 # normalize top level -ident, for saner extract_fixed_condition_columns code
1236 elsif (ref $rhs eq 'HASH' and keys %$rhs == 1 and exists $rhs->{-ident}) {
1237 push @conds, { $lhs => { '=', $rhs } };
1238 }
1239 elsif (ref $rhs eq 'HASH' and keys %$rhs == 1 and exists $rhs->{-value} and is_plain_value $rhs->{-value}) {
1240 push @conds, { $lhs => $rhs->{-value} };
1241 }
1242 elsif (ref $rhs eq 'HASH' and keys %$rhs == 1 and exists $rhs->{'='}) {
1243 if ( length ref $rhs->{'='} and is_literal_value $rhs->{'='} ) {
1244 push @conds, { $lhs => $rhs };
1245 }
1246 else {
1247 for my $p ($self->_collapse_cond_unroll_pairs([ $lhs => $rhs->{'='} ])) {
1248
1249 # extra sanity check
1250 if (keys %$p > 1) {
1251 require Data::Dumper::Concise;
1252 local $Data::Dumper::Deepcopy = 1;
1253 $self->throw_exception(
1254 "Internal error: unexpected collapse unroll:"
1255 . Data::Dumper::Concise::Dumper { in => { $lhs => $rhs }, out => $p }
1256 );
1257 }
1258
1259 my ($l, $r) = %$p;
1260
1261 push @conds, (
1262 ! length ref $r
1263 or
1264 ref $r eq 'HASH' and keys %$rhs == 1 and exists $rhs->{'='}
1265
1266 or
1267 is_plain_value($r)
1268 )
1269 ? { $l => $r }
1270 : { $l => { '=' => $r } }
1271 ;
1272 }
1273 }
1274 }
1275 elsif (ref $rhs eq 'ARRAY') {
1276 # some of these conditionals encounter multi-values - roll them out using
1277 # an unshift, which will cause extra looping in the while{} above
1278 if (! @$rhs ) {
1279 push @conds, { $lhs => [] };
1280 }
1281 elsif ( ($rhs->[0]||'') =~ /^\-(?:and|or)$/i ) {
1282 $self->throw_exception("Value modifier not followed by any values: $lhs => [ $rhs->[0] ] ")
1283 if @$rhs == 1;
1284
1285 if( $rhs->[0] =~ /^\-and$/i ) {
1286 unshift @$pairs, map { $lhs => $_ } @{$rhs}[1..$#$rhs];
1287 }
1288 # if not an AND then it's an OR
1289 elsif(@$rhs == 2) {
1290 unshift @$pairs, $lhs => $rhs->[1];
1291 }
1292 else {
1293 push @conds, { $lhs => [ @{$rhs}[1..$#$rhs] ] };
1294 }
1295 }
1296 elsif (@$rhs == 1) {
1297 unshift @$pairs, $lhs => $rhs->[0];
1298 }
1299 else {
1300 push @conds, { $lhs => $rhs };
1301 }
1302 }
1303 # unroll func + { -value => ... }
1304 elsif (
1305 ref $rhs eq 'HASH'
1306 and
1307 ( my ($subop) = keys %$rhs ) == 1
1308 and
1309 length ref ((values %$rhs)[0])
1310 and
1311 my $vref = is_plain_value( (values %$rhs)[0] )
1312 ) {
1313 push @conds, { $lhs => { $subop => $$vref } }
1314 }
1315 else {
1316  87  76µs        push @conds, { $lhs => $rhs };
1317 }
1318 }
1319 }
1320
1321  87  223µs    return @conds;
1322  }
1323
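A standalone sketch of the multi-value unrolling above: `col => [ -and => v1, v2 ]` is pushed back onto the pair queue as separate `col => value` pairs, which is what causes the extra iterations of the surrounding `while` loop:

```perl
my @pairs = ( artist => [ -and => 1, 2 ] );

my ($lhs, $rhs) = splice @pairs, 0, 2;
if ( ref $rhs eq 'ARRAY' and ( $rhs->[0] || '' ) =~ /^\-and$/i ) {
    # re-queue each value as its own lhs => value pair
    unshift @pairs, map { ( $lhs => $_ ) } @{$rhs}[ 1 .. $#$rhs ];
}
# @pairs is now ( artist => 1, artist => 2 )
```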
1324# Analyzes a given condition and attempts to extract all columns
1325# with a definitive fixed-condition criteria. Returns a hashref
1326# of k/v pairs suitable to be passed to set_columns(), with a
1327# MAJOR CAVEAT - multi-value (contradictory) equalities are still
1328 # represented as a reference to the UNRESOLVABLE_CONDITION constant.
1329# The reason we do this is that some codepaths only care about the
1330 # condition being stable, as opposed to actually making sense
1331#
1332# The normal mode is used to figure out if a resultset is constrained
1333# to a column which is part of a unique constraint, which in turn
1334# allows us to better predict how ordering will behave etc.
1335#
1336# With the optional "consider_nulls" boolean argument, the function
1337 # is instead used to infer unambiguous values from conditions
1338# (e.g. the inheritance of resultset conditions on new_result)
1339#
1340
# spent 6.06ms (1.32+4.73) within DBIx::Class::Storage::DBIHacks::_extract_fixed_condition_columns which was called 87 times, avg 70µs/call: # 87 times (1.32ms+4.73ms) by DBIx::Class::ResultSource::_minimal_valueset_satisfying_constraint at line 1581 of DBIx/Class/ResultSource.pm, avg 70µs/call
sub _extract_fixed_condition_columns {
1341  87  72µs    my ($self, $where, $consider_nulls) = @_;
1342  87  274µs  87  4.73ms    my $where_hash = $self->_collapse_cond($_[1]);
# spent 4.73ms making 87 calls to DBIx::Class::Storage::DBIHacks::_collapse_cond, avg 54µs/call
1343
1344  87  53µs    my $res = {};
1345  87  26µs    my ($c, $v);
1346  87  77µs    for $c (keys %$where_hash) {
1347  87  17µs      my $vals;
1348
1349  87  297µs      if (!defined ($v = $where_hash->{$c}) ) {
1350 $vals->{UNDEF} = $v if $consider_nulls
1351 }
1352 elsif (
1353 ref $v eq 'HASH'
1354 and
1355 keys %$v == 1
1356 ) {
1357 if (exists $v->{-value}) {
1358 if (defined $v->{-value}) {
1359 $vals->{"VAL_$v->{-value}"} = $v->{-value}
1360 }
1361 elsif( $consider_nulls ) {
1362 $vals->{UNDEF} = $v->{-value};
1363 }
1364 }
1365 # do not need to check for plain values - _collapse_cond did it for us
1366 elsif(
1367 length ref $v->{'='}
1368 and
1369 (
1370 ( ref $v->{'='} eq 'HASH' and keys %{$v->{'='}} == 1 and exists $v->{'='}{-ident} )
1371 or
1372 is_literal_value($v->{'='})
1373 )
1374 ) {
1375 $vals->{ 'SER_' . serialize $v->{'='} } = $v->{'='};
1376 }
1377 }
1378 elsif (
1379 ! length ref $v
1380 or
1381 is_plain_value ($v)
1382 ) {
1383 $vals->{"VAL_$v"} = $v;
1384 }
1385 elsif (ref $v eq 'ARRAY' and ($v->[0]||'') eq '-and') {
1386 for ( @{$v}[1..$#$v] ) {
1387 my $subval = $self->_extract_fixed_condition_columns({ $c => $_ }, 'consider nulls'); # always fish nulls out on recursion
1388 next unless exists $subval->{$c}; # didn't find anything
1389 $vals->{
1390 ! defined $subval->{$c} ? 'UNDEF'
1391 : ( ! length ref $subval->{$c} or is_plain_value $subval->{$c} ) ? "VAL_$subval->{$c}"
1392 : ( 'SER_' . serialize $subval->{$c} )
1393 } = $subval->{$c};
1394 }
1395 }
1396
1397  87  271µs      if (keys %$vals == 1) {
1398 ($res->{$c}) = (values %$vals)
1399 unless !$consider_nulls and exists $vals->{UNDEF};
1400 }
1401 elsif (keys %$vals > 1) {
1402 $res->{$c} = UNRESOLVABLE_CONDITION;
1403 }
1404 }
1405
1406  87  256µs    $res;
1407  }
1408
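A heavily simplified, standalone sketch of the classification above, operating on an already-collapsed condition: plain values and `{ -value => ... }` pairs count as fixed, while inequalities do not. (The real method additionally handles `-ident`, literals, `-and` value bags, null consideration, and the UNRESOLVABLE_CONDITION marker.)

```perl
my $where_hash = {
    artist => 1,                    # plain value        -> fixed
    title  => { -value => 'x' },    # explicit bind      -> fixed
    year   => { '>' => 2000 },      # range, not an '='  -> not fixed
};

my %res;
for my $c (keys %$where_hash) {
    my $v = $where_hash->{$c};
    if ( defined $v and ! ref $v ) {
        $res{$c} = $v;              # definitive equality to a plain value
    }
    elsif ( ref $v eq 'HASH' and keys %$v == 1 and exists $v->{-value} ) {
        $res{$c} = $v->{-value};    # definitive equality to a bind value
    }
    # anything else (ranges, -or lists, ...) yields no fixed value
}
# %res is ( artist => 1, title => 'x' )
```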
1409  1  11µs  1  261µs  1;
# spent 261µs making 1 call to B::Hooks::EndOfScope::XS::__ANON__
 
# spent 2.13ms within DBIx::Class::Storage::DBIHacks::CORE:match which was called 1305 times, avg 2µs/call: # 870 times (1.79ms+0s) by DBIx::Class::Storage::DBIHacks::_resolve_column_info at line 739, avg 2µs/call # 174 times (190µs+0s) by DBIx::Class::Storage::DBIHacks::_collapse_cond_unroll_pairs at line 1220, avg 1µs/call # 174 times (108µs+0s) by DBIx::Class::Storage::DBIHacks::_collapse_cond at line 1048, avg 618ns/call # 87 times (43µs+0s) by DBIx::Class::Storage::DBIHacks::_collapse_cond at line 1193, avg 494ns/call
sub DBIx::Class::Storage::DBIHacks::CORE:match; # opcode
# spent 104µs within DBIx::Class::Storage::DBIHacks::CORE:sort which was called 174 times, avg 596ns/call: # 87 times (84µs+0s) by DBIx::Class::Storage::DBIHacks::_collapse_cond at line 1004, avg 964ns/call # 87 times (20µs+0s) by DBIx::Class::Storage::DBIHacks::_collapse_cond at line 1045, avg 228ns/call
sub DBIx::Class::Storage::DBIHacks::CORE:sort; # opcode