Mainstream network models for ITS have traditionally utilized switched/routed Ethernet technologies to manage communications between the core and the edge. “Active Optical Networks” (AONs), as the name indicates, deploy “active” network management devices within the system architecture to provide network management capabilities.

Recent years have seen technologies typically used by communications providers proposed for use in support of next-gen ITS. One such architecture utilizes multiplexing and “passive” fiber optic splitters to maximize the use of fiber optic infrastructure. Passive Optical Networks (PONs) deploy head-end and edge equipment much like AONs, plus “passive” optical splitters that facilitate the multiplexing of data at the head-end and its distribution in the field.


The following provides a comparison of the advantages and disadvantages of “actively switched” and “passively split” fiber optic communications technologies.


AONs utilize intelligent switching and routing to direct and manage data flow within a network. The emergence of Ethernet architecture, protocols and topologies in the transportation and Intelligent Transportation Systems (ITS) arena occurred around 1999, and is rooted in traditional, switched Ethernet networks that date back over 35 years.


  • Active Optical Networks have been successfully utilized for transportation and ITS networks for more than 10 years, and currently represent the industry standard.
  • AONs provide a framework and architecture that enables physical route redundancy, which protects against fiber cuts and enables optimized data management strategies.
  • Adding new devices to an AON is simple “plug and play,” with little to no network reconfiguration required.
  • AONs allow the deployment of equipment and protocols that are standardized within the telecommunications industry.
  • AON hardware is ubiquitous, with many hardware vendors available, minimizing specific hardware/software/vendor dependencies.
  • An AON ring architecture allows for simplified network reconfiguration, the implementation of additional head-ends, or the relocation of a head-end by simply connecting to the nearest point on the nearest ring.
  • AONs are fully bi-directional, with equal upstream and downstream transmission rates.


  • Edge switches cost more than the PON’s cabinet devices, Optical Network Terminals (ONTs).
  • A fully redundant core switch with optics costs more than fully redundant PON Optical Line Terminals (OLTs).
  • Requires the installation of additional conduit and cable runs to achieve physical redundancy.


PONs are a relatively new technology that emerged from the telecom and cable industries, most notably as a result of surging demand for consumer-driven, high-bandwidth applications in the residential market. Due to the required data payloads, including voice, video and data, communications providers realized a need to maximize the use of every fiber deployed in their networks. Gigabit PON, or GPON, is an asymmetrical, “shared” bandwidth configuration with a maximum downstream bandwidth of 2.4 Gbps and a total upstream bandwidth of 1.2 Gbps per fiber.
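The practical effect of these shared-bandwidth figures can be shown with simple arithmetic. The sketch below divides the cited GPON totals across a common 32-way split; this is a worst-case figure, since the OLT's time-slot scheduling lets idle ONTs yield capacity to active ones.

```python
# The 2.4 Gbps / 1.2 Gbps totals come from the GPON spec cited above;
# the 32-way split ratio is the common configuration discussed below.
DOWNSTREAM_GBPS = 2.4
UPSTREAM_GBPS = 1.2

def per_device_mbps(total_gbps, split_ratio):
    """Worst-case per-ONT bandwidth if every device on the split is active."""
    return total_gbps * 1000 / split_ratio

down = per_device_mbps(DOWNSTREAM_GBPS, 32)  # ~75 Mbps
up = per_device_mbps(UPSTREAM_GBPS, 32)      # ~37.5 Mbps
print(f"32-way split: {down:.1f} Mbps down, {up:.1f} Mbps up per ONT")
```

For an ITS cabinet pushing several video streams upstream, that ~37.5 Mbps worst-case upstream share is the number to watch.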


  • Edge devices (cabinet transceivers, or ONTs) are less expensive than typical Ethernet edge switches.
  • PONs utilize multiplexing (mux) technologies that enable multiple channels (typically 32 or 64) to share a single fiber.
  • With a 32-way split, this multiplexing architecture enables communications to 32 different field devices over a single mux’d fiber.


  • Due to the point-to-multipoint architecture of PONs, radiating from a single root, the technology is unable to feasibly implement physical route redundancy for communications.
  • If a splitter goes offline (splitter failure or cabinet hit), or a break in the cable occurs between the splitter and the head-end OLT (TMC), then all devices downstream of the break or the splitter will go offline until repairs can be made. Splitters represent a significant Single Point of Failure (SPOF), which network design typically tries to minimize.
  • Due to the multiplexed nature of the data transmission, faults are difficult to identify, isolate and remedy.
  • The nature of PONs requires that fiber be installed in a manner that provides less flexibility for future network reconfiguration: all fiber backbone radiates from a single point.
  • The PON architecture does not allow for the implementation of typical Ethernet network management features and protocols associated with Layer 3 of the OSI model.
  • Passive Optical LAN architecture is not recognized by any North American or international standards body as a standards-based architecture for the enterprise environment.
  • PONs’ asymmetrical framework (designed for delivering video/voice to the home) results in an architecture that provides more downstream bandwidth (2.4 Gbps) than upstream bandwidth (1.2 Gbps) – generally the opposite of the needs of a typical ITS installation, and thus an inefficient use of the technology. The potential downstream bandwidth (2.4 Gbps) will probably never be realized, nor required, while the upstream limitation of 1.2 Gbps could constrain some future installations.
  • Because a defined group of ONTs shares an upstream path, data collisions are possible. To prevent this, the OLT allocates specific time slots in which each ONT may transmit its data back to the OLT. This works well in telecommunications/cable company networks, where the majority of data payload demand is downstream. In an ITS environment, however, the opposite is true: upstream demand is far greater and more critical.
  • No Power over Ethernet (PoE) capability via the ONT.

Sources and Resources

Passive Optical Networks


The recent increase in new transportation data sources, coupled with the enhanced density of those sources, is leading to a number of new analytics tools for transportation professionals. One of the latest analysis resources to emerge fuses real-time/near-real-time social data with traditional mobility data (speed, location, trajectory). Human sensor networks wield a much more comprehensive ability to report on traditional mobility data attributes, and also provide additional resolution for location and time, as well as characteristics regarding “sentiment”.

Sentiment analysis aggregates and filters real-time data from the web and social media resources and reduces the data for context and transportation value. Sentiment mapping links data sources with locations, or future locations, and with the detected sentiment related to each location. For example, data crawlers filter for key words (accident, crash, traffic, I-66, I-95) and assess the sentiment tied to messages (slow, clear, backup, bad, good). The resulting sentiment model is then tied to time and location data to produce a sentiment graph. Sentiment mapping and sentiment analysis can also be utilized for predictive analytics, where content is analyzed to identify future location-based sentiment for future conditions.
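A minimal sketch of the keyword-filter-and-score step described above is shown below. The topic and sentiment word lists, and the sample messages, are illustrative placeholders only, not a production vocabulary; real systems use far richer language models.

```python
# Keywords drawn from the example in the text; purely illustrative.
TOPIC_WORDS = {"accident", "crash", "traffic", "i-66", "i-95"}
POSITIVE = {"clear", "good", "moving"}
NEGATIVE = {"slow", "backup", "bad", "stopped"}

def score_message(text):
    """Return +1, -1 or 0 for transportation-related messages, else None."""
    words = set(text.lower().split())
    if not words & TOPIC_WORDS:
        return None  # no transportation context detected; filtered out
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    return (pos > neg) - (neg > pos)  # +1, -1, or 0 on a tie

messages = [
    "traffic on i-95 is bad slow backup",
    "i-66 looking clear and good this morning",
    "great coffee downtown",
]
print([score_message(m) for m in messages])  # [-1, 1, None]
```

Each non-None score would then be joined with the message's timestamp and geotag to build the sentiment graph described above.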

The following links provide a primer as well as preliminary information on development and configuration of sentiment data models.

References and Resources

How Social Media Can Improve and Redesign Transport Systems
Transportation Sentiment Analysis for Safety Enhancement
Creating a Sentiment Analysis Model
Sentiment Analysis
How Smart Cities are Using API’s

A lot has been written about “big data” lately. The rapid growth of varying data sources, coupled with the enhanced density of those sources, is establishing a huge resource for transportation operators. The rapid proliferation of data sources from new devices such as smartphones and other newly connected devices, in conjunction with the advancement of technologies for data collection and management, has created a sizeable inflection point in the availability of data. So what does this mean for ITS operators and the systems they currently manage? What will be required to extract and leverage the values associated with “big data”?

At First Glance

Federal regulations for performance measures and real-time monitoring associated with MAP-21 and 23 CFR 511 have created a framework that increases the need for new, refined data and information systems. System enhancements will require improvements to existing networks and communications systems in order to optimize data and metadata flows between data sources and central applications. Robust central network equipment, including Layer 3 switches, servers and storage, will also be required. Enhanced security measures associated with new data sources and big-data values will also need to be reviewed and attended to. New central data warehouse infrastructure will also be required, including new data platforms (such as Hadoop) that are capable of managing “big data” and the “Internet of Things” (IoT).
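As one concrete example of the performance-measure processing such central systems perform, the sketch below computes a simplified travel-time reliability ratio, loosely modeled on the federal LOTTR measure (80th-percentile over median travel time). The travel-time figures are made up, and the calculation is a simplification of the actual regulatory definition.

```python
import statistics

# Hypothetical archived corridor travel times in minutes (illustrative).
travel_times_min = [12.0, 12.5, 11.8, 14.0, 22.0, 12.2, 13.1, 12.7, 19.5, 12.4]

# Reliability ratio: 80th percentile over the median. A ratio near 1.0
# means consistent travel times; larger values mean unreliable ones.
p50 = statistics.median(travel_times_min)
p80 = statistics.quantiles(travel_times_min, n=10)[7]  # 80th percentile
lottr = p80 / p50
print(f"median={p50:.1f} min, 80th pct={p80:.1f} min, ratio={lottr:.2f}")
```

Running this kind of computation continuously, per corridor, is one of the workloads driving the server, storage and data-platform needs noted above.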

Deeper Dive

A closer look reveals additional layers of change required in order to begin abstracting value from the new data sources.  “Big data” will also require somewhat less obvious changes in the way transportation agencies currently do business.

Increased Data Management and Analytics Expertise – The new data paradigm will require new staff skills, most notably experience in data analytics (“quants”). Staff will need knowledge not only of the data available now or potentially available in the near term, but also of transportation systems, in order to apply the most beneficial data mining tactics available. The new role must not only be aware of current data and information needs and values, but also be cognizant of what is possible, including potential hidden values currently unrealized or unknown by an operating agency. The new role will also be an integral part of the development of embedded system features, be able to identify nuances in data meaning, and establish effective predictive analytics.

Policy and Digital Governance – New data sources are also giving rise to discussion regarding privacy and liability. Data sourced from private entities will always contend with privacy fears and concerns, at least for the near term, although recent analysis shows a steady lessening of those fears as “digital natives” come to represent a greater percentage of the traveling public. Data generated from sources outside of transportation agencies, but utilized by transportation agencies for systems operations, can lead one to question who is responsible should data errors occur that affect a system.

Networks and Communications – Data sources, formats and general data management practices will need an extensive review of existing conditions, including what values are attained from real-time or near-real-time collection and subsequent analytics, as well as what data is less time dependent. Existing formats and protocols should also be included in the mapping exercise; for example, Connected Vehicle (CV) deployments will require a mandatory upgrade from IPv4 to IPv6. General planning regarding the utilization of “the cloud” needs to be weighed for benefit-cost. Third-party data brokers and other outsourcing alternatives such as cloud computing also need to be assessed.
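As a small illustration of the IPv4-to-IPv6 planning item, the sketch below audits a hypothetical device inventory using Python's standard ipaddress module; the device names and addresses are invented (the IPv6 entry uses the reserved documentation prefix).

```python
import ipaddress

# Hypothetical field-device inventory; names and addresses are illustrative.
devices = {
    "signal-001": "10.0.14.22",
    "camera-007": "10.0.14.23",
    "rsu-042": "2001:db8::42",  # already IPv6 (documentation prefix)
}

def needs_migration(addr):
    """True if the address is IPv4 and would need an IPv6 migration plan."""
    return ipaddress.ip_address(addr).version == 4

to_migrate = [name for name, addr in devices.items() if needs_migration(addr)]
print(to_migrate)  # ['signal-001', 'camera-007']
```

A real audit would pull addresses from the network management system rather than a hard-coded dictionary, but the mapping exercise is the same.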

Data Management and Analysis Tools – Operating entities also need to look at implementing data management tools (applications) that will assist in extracting value from large data sets.  These tools  should be integrated with core systems, and provide real-time metrics of collected data.  The tools should also provide the ability for “Cloud collaboration”, in order to process data stored by third parties, or general data stored in the cloud.

Wisdom Knowledge Information Data Pyramid

What to do

Transportation budgets are as tight as ever. How can operating agencies begin to make incremental steps towards the goal of realizing benefits associated with “big data”?  The first step is to begin now.  Start by mapping existing data sources to existing data management technologies, policies and processes, from end to end.  Also, widen your perspective and begin to look at possible benefits from a wide array of new data sources.  In addition, “open” it up, and benefit from the wisdom of the crowd.  New analytics skill sets should be considered a condition of certain new hires in the transportation and ITS planning departments.  A staff member should be designated for leading the way with decisions regarding “big data”, relationships with third party data brokers, cloud management, as well as be responsible for implementing an agile framework for next-gen data systems.

References and Resources
Developing a Data Management Program for Next-Gen ITS: A Primer for Mobility Managers
Big Data and Transport
TransDec: Big Data for Transportation
Update from the Data Liberation Front

The ITS planner, designer and operator should always be cognizant of the life-cycle of the overall system and its integrated subsystems and components. Timing of next-gen ITS integration can be optimized, both fiscally as well as technically, by considering the wide spectrum of variables associated with life-cycle management.  The following graphic presents a general overview of a typical systems life-cycle:


Typical life-cycle management should also include evaluation of the maturity of next-gen ITS technologies and the systems required to support a new ITS solution.


With market saturation of the smartphone looming, and the emergence of connected vehicles, peer-to-peer resource management, crowdsourcing and collaborative platforms, one could easily surmise that the “consumerization” of significant components of Next-Gen Intelligent Transportation Systems (ITS) is well underway. What is not exactly clear is what the landscape will look like during the transition, or once consumerization is firmly rooted.

It’s clear that public mobility managers will continue to provide certain services to their constituents for the foreseeable future; however, it is expected that some existing services will be provisioned through consumerization. Consumerization will also give rise to entirely new service needs. New areas of expertise will be required for data and information management, systems management and X2X networks, to name just a few. Will consumerization lead to less strain on agency coffers? Or will it simply generate new needs equal to or greater than existing financial burdens? We’ll take deeper dives on these issues in coming posts.


References and Resources

Sources of Innovation

Posted: May 15, 2013 in Innovation, Planning

One of my favorite aspects of the technology industry is that the “next big thing” can come from just about anywhere, at any time. Intelligent Transportation Systems (ITS) envelops a significant array of core industries, and feeds off of quite a few other innovation ecosystems. However, monitoring and tracking these future trends and relevant upstream currents can be challenging at best.

One of the first tools I developed was an innovation resource matrix, which attempts to map core and key industries and innovation resources integrated with or tangent to the ITS industry. An example of this first generation map is provided in the following graphic. In later posts we’ll examine philosophies and strategies for managing and navigating the hype, fog and moats associated with trends analysis and future-casting.


There is no denying “big data” and its importance to next-gen ITS applications. The emergence of a vast, omnipresent data cloud is enabling new knowledge and wisdom to be attained, as well as facilitating new operations models for the mobility manager. Unfortunately, parochial data systems and data management strategies are quickly becoming obsolete with regard to managing this quickly evolving paradigm. As a result, the need for operating institutions and mobility managers to understand “big data” and implement new, comprehensive and overarching data management strategies has never been greater. Next-gen data and information systems will need to be autonomous, contextual, predictive and real-time. The overall impact is cascading: a new strategy is not only desired, but will become an essential function as the proliferation of meaningful data sources accelerates. The time for agencies to plan, prepare, implement and transition is now. The following aggregates a few thoughts into an introductory package for agencies to consider as they get started, in hopes of widening the road to success.


Although all of the values of new “big data” resources are not yet fully understood, the danger of getting bogged down in the data deluge is already being felt.  Before these new values can be leveraged, we must first review, research and retool, predicated on a sound understanding of existing conditions and extensive research and evaluation of likely future conditions and future capabilities. In addition, programmatic and industry changes such as MAP-21 and the Connected Vehicle are changing the operational fabric and are mandating new requirements for mobility managers, and thus, also need to be considered when developing a new data and information management strategy.


So where to start? The following insights are framed within the “What/How” solutions model: “What do we want/need?”, and then, “How do we do it?” As is the case with all sound planning efforts, an accurate understanding of existing conditions is an essential first step prior to commencing with future planning efforts.



Stakeholders and Champions – The first step is to identify all possible stakeholders (including champions and arbiters), both internal and external to an operating entity.  It’s key to remember that the data paradigm shift will cover all departments, agencies, programs and offices within a city and/or region, therefore coordination with an overarching perspective is essential for success.  Typically non-traditional stakeholders will now play important roles and become key teammates.  The identification of the initial list of stakeholders should include a first draft of a new steering committee or “Data Management Team” (DMT), which should encompass all pertinent agencies and institutions.

What do we have?

Following the formation of the DMT, the team should begin to assess existing conditions. Some key questions to get started include:

  • What are our existing data generators?
  • What systems are required to support these data generators?
  • How do we currently source, transmit and aggregate data from existing data sources?
  • What data and information-based goals and objectives are currently in place?
  • What are our existing processes for measuring and monitoring the path towards prescribed goals?
  • What values are we realizing/not realizing?
  • What standards and formats do we utilize?
  • What policies and regulations currently exist?
  • What quality control processes and procedures are in place?
  • What licensing, warranty and policy factors impact our data and information systems?

These questions will likely uncover significant new understanding as to how an agency currently handles data, and identify opportunities lost or new opportunities for functional improvements. The baseline assessment needs to include identification and mapping of existing supporting systems and infrastructure, including networking and software applications.  The exploration should also begin to drill down and refine existing information such as data attributes. A list of attributes might include:


  • Source
  • Owner
  • Use rights
  • Format
  • Polling rates
  • Current uses (realized)
  • Potential uses (unrealized)
  • Quality
  • Cleansing/conditioning
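One possible way to capture the attribute list above in a tracking tool is a simple record type. The field names and sample values below are suggestions for illustration, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataSourceRecord:
    """One inventory entry covering the attributes listed above."""
    source: str
    owner: str
    use_rights: str
    data_format: str
    polling_rate_s: int                 # seconds between polls
    current_uses: list = field(default_factory=list)    # realized
    potential_uses: list = field(default_factory=list)  # unrealized
    quality_notes: str = ""
    cleansing_steps: list = field(default_factory=list)

# Hypothetical example entry; all values are illustrative.
rec = DataSourceRecord(
    source="loop detectors, I-95 corridor",
    owner="State DOT",
    use_rights="internal",
    data_format="NTCIP/XML",
    polling_rate_s=20,
    current_uses=["speed maps"],
    potential_uses=["travel-time reliability"],
)
print(rec.source, rec.polling_rate_s)
```

Collecting one such record per data generator gives the DMT a concrete artifact to review during the interim review step.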

Data Support Systems and Applications

  • Infrastructure requirements
  • Software dependencies
  • Other OSI reference model considerations

Policies, Guidelines and Contracts

  • Use policies
  • Cost per byte/poll
  • Licensing and Warranties
  • Existing vendor contracts, limitations
  • Storage and Retrieval
  • Performance metrics and monitoring
  • Existing staff requirements

Interim Review – Immediately following initial exploration of existing conditions, the Data Management Team should conduct an interim review of its findings. In addition, the DMT should review any and all existing goals and objectives related to data and information systems. What are we truly trying to accomplish and what are we achieving? What are we not achieving? What are the perceived initial gaps?  The initial review of existing conditions will likely trigger additional exploration needs with regards to existing data and information systems.  The interim review will also likely uncover additional stakeholders, both internal and external to the mobility management ecosystem.

Mapping – Map your findings.  As with all good wayfinding processes, a “you are here” marker is essential.  The goal is to map all exploration activities and contextualize the existing data and information system landscape.  In addition to narrative and graphical mapping, a spreadsheet or database is also helpful for tracking results such as data and information attributes.
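The spreadsheet-style tracking suggested above can be sketched with the standard csv module; the column names and rows below are illustrative only.

```python
import csv
import io

# Columns mirror a few of the attributes discussed earlier; illustrative.
fields = ["source", "owner", "format", "polling_rate_s", "quality"]
rows = [
    {"source": "CCTV feeds", "owner": "City TMC", "format": "RTSP",
     "polling_rate_s": "1", "quality": "good"},
    {"source": "probe speeds", "owner": "3rd-party broker", "format": "JSON",
     "polling_rate_s": "60", "quality": "unverified"},
]

# Write to an in-memory buffer; a real exercise would write a shared file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fields)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Even a flat file like this makes the "you are here" marker shareable across the DMT's member agencies.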

Projections and Forecasts

The next step will be to begin exploring and researching existing trends and to forecast future trends and conditions. Predicting the future is always challenging at best. However, with a sound, comprehensive strategy in place, an organization can plan and implement strategies that prepare it for potential future conditions. Trends analysis and future-conditions forecasting will assist in establishing a pragmatic orientation for the foreseeable future. These assessments should be conducted in parallel with, yet separate from, the existing-conditions exploration and mapping tasks. (The simultaneous work efforts will assist in finalizing the existing-conditions survey by uncovering additional gaps and identifying additional research required.)

Current Trends – Current trends such as cloud-computing, smartphones, mobile apps, private data sourcing, crowdsourcing, and integrated corridor management (ICM) need to be identified and included in new data management strategies. MAP-21 and other Federal requirements will mandate a new minimum acceptance level for the operating entities and also need to be immediately included in planning efforts.  It’s important to look past today’s sheen of certain applications and technologies to truly understand where industries and agencies are headed.

Future Trends – Connected Vehicle, including V2X, or V2I components will directly impact operating agencies and the way they do business in the coming years. Other likely future trends such as the autonomous vehicles, City as a Platform and integration of transportation networks will directly impact the data and information framework.  Additional trends such as system automation and data driven systems will amplify the need for pertinent real-time data.


The “Future-Casting” task should also assign segments of industry to in-house champions (domain expertise), in order to monitor federal regulations, funding streams, the information technology and automobile sectors, university, state and federal research tracks, consumer technology markets, as well as tangential markets and adjacent internal agencies and divisions.

What do we want/need?

Immediately following the initial existing conditions survey and research and forecasting of future trends and conditions, the DMT should revisit original goals and objectives regarding data and information systems, and modify/append accordingly.  At this point, a traditional “User Needs and Preferences” assessment can be conducted, and should follow a traditional Systems Engineering framework. Some of the basic questions to address include:

  • Have we properly identified and defined all of our goals and objectives?
  • How do we plan to leverage enriched data environments?
  • How will this foster enhanced wisdom and adaptive genius within our mobility ecosystem?
  • How will we monitor our progress towards achieving our goals and objectives (performance measures)?
  • Have we instituted agency changes appropriate and sufficient to meet our goals and objectives?

By this point, you should have a fairly sound understanding of all of the existing data and information systems within the agency/region. However, additional iterations of the exploration, mapping, and wants-and-needs assessments may be required to truly understand where you are, and where you want to be (goals).


Once goals and objectives have been set, we can begin to assess how we get there. As with most planning efforts, an alternatives analysis, a Long Range Plan and an Implementation Plan need to be developed. A scale-versus-value and ROI assessment is conducted at this point as well. As is always the case with future-proofing, the key is not to design for specific (undefined, and in some cases unknown) technologies, methodologies and strategies, but to identify and anticipate likely future conditions and implement a framework that is agile, flexible and capable of embracing future technologies, strategies and methodologies.

The next step is to establish a requirements-based blueprint and roadmap for transitioning from today to tomorrow. It’s important to set measurable goals and identify the performance metrics necessary to track progress towards goals and objectives, and to conduct evaluative assessments. This step should also include a traditional gap analysis. The Long Range Plan should include a Concept of Operations. This step will also begin to define the “rewiring” necessary for executing the new data and information management program, which should include business rules. In addition, the new data management schema needs to be integrated with the overall (typical) planning processes, including budgeting, long-range plans and regional plans.

Staffing resources and annual operations should also be assessed at this point. Domain expertise, staffing and skills requirements will need to be addressed, and should be included in the initial existing-conditions exploration. A new Data Manager position is likely the most appropriate first hire. This individual may be an MPO, DOT or local agency staff person in charge of overseeing the harmonization of data and information systems across all platforms and across jurisdictional and agency boundaries. Data scientists/analysts will also likely be required.

Additional Challenges and Potential Impediments to Consider

Initial Buy-in and Engagement – As with most new initiatives, getting up from the “comfy couch” can be the biggest challenge to implementing new or improved strategies.  Generating the initial inertia and momentum will require champions at the administrative, technical and arbiter levels, within all stakeholders, departments, agencies and regional staff (MPO).

Data use and retention policies – Some data may be approved for certain uses; however, additional uses may raise privacy, licensing or ownership issues. This challenge also gives rise to additional hurdles, including operational governance and regulation of the new data and information system. For example, can private data be sourced to operate public systems (signal systems, etc.) where safety is critical?

Integration and Standardization – What level of data and system integration is optimal, or will achieve the greatest benefit/cost ratio for an operating entity? What granularity and resolution (data density) are required for each component of the goals? Automated monitoring and performance reporting will be key to success with regards to overall integration and standardization.

Sustainability – A new funding stream (outgoing) is likely required.  However, the potential for additional revenue streams (incoming) is also likely.  Funding needs to be identified for the initial capital outlay, as well as annual operations and maintenance cost for the life-cycle of the system and subsystems.

Security – As the data reservoir expands, and the network to support and manage the data and information systems expands, so will the security concerns.  New policies and data management applications will be essential. Data storage, encryption, access rights, use rights as well as infrastructure and support applications should all be included in the initial security assessment and security planning efforts.

References and Resources

Transportation Data and Information Systems – LinkedIn Working Group
USDOT Research Data Exchange
Research, technology, and data drive America’s transportation system – USDOT Transportation Secretary
Real-Time Data Capture and Management