In collaboration with Deakin's DSTIL team, we had a paper accepted to a special issue of Applied Soft Computing. The work looks at learning weights for inequality- and aggregation-based indices for measuring traffic, with a particular focus on finding indices that correlate with low traffic speeds. (available online)
Title: Measuring traffic congestion: An approach based on learning weighted inequality, spread and aggregation indices from comparison data
Authors: G. Beliakov, M. Gagolewski, S. James, S. Pace, N. Pastorello, E. Thilliez and R. Vasa
As cities increase in size, governments and councils face the problem of designing infrastructure and traffic management approaches that alleviate congestion. Objectively measuring congestion involves taking into account not only the volume of traffic moving through a network, but also the inequality or spread of this traffic over major and minor intersections. For modeling such data, we investigate the use of weighted congestion indices based on various aggregation and spread functions. We formulate the weight learning problem for comparison data and use real traffic data obtained from a medium-sized Australian city to evaluate the usefulness of these indices.
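To give a flavour of a weighted aggregation index of the kind discussed, here is a minimal sketch of an ordered weighted averaging (OWA) function applied to intersection speeds. This is an illustrative assumption, not the paper's exact indices or its weight-learning method: the weights, speed values and function names are hypothetical, and in the paper the weights would be learned from comparison data rather than set by hand.

```python
import numpy as np

def owa(weights, values):
    """Ordered weighted averaging: weights apply to the SORTED values
    (descending here), so they can emphasise, say, the slowest
    intersections regardless of which intersections those are."""
    w = np.asarray(weights, dtype=float)
    v = np.sort(np.asarray(values, dtype=float))[::-1]  # descending order
    assert w.shape == v.shape and np.isclose(w.sum(), 1.0)
    return float(np.dot(w, v))

# Hypothetical average speeds (km/h) at four intersections.
speeds = [55.0, 40.0, 22.0, 10.0]

# Uniform weights reduce OWA to the plain arithmetic mean.
print(owa([0.25, 0.25, 0.25, 0.25], speeds))  # 31.75

# Weights skewed toward the lowest speeds emphasise congested spots,
# pulling the index well below the mean.
print(owa([0.0, 0.1, 0.3, 0.6], speeds))      # 16.6
```

Learning the weight vector from pairwise comparisons (e.g. "scenario A was more congested than scenario B") can then be posed as a constrained fitting problem over such indices.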