Academic literature on the topic 'TAXI AGGREGATOR INDUSTRY'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'TAXI AGGREGATOR INDUSTRY.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "TAXI AGGREGATOR INDUSTRY"

1

Eguia, Rec. "SHADOW ECONOMY: EFFICIENCY AND RESILIENCY OF INFORMAL TRADING INDUSTRY OF NON CONVENTION SIZE SHIPS (NCSS) IN TAWI TAWI PROVINCE." BIMP-EAGA Journal for Sustainable Tourism Development 4, no. 2 (December 12, 2015): 48–58. http://dx.doi.org/10.51200/bimpeagajtsd.v4i2.3186.

Full text
Abstract:
This study aims to investigate the efficiency and resiliency of the informal economy of the trading industry of non-convention size ships (NCSS) in Tawi-Tawi Province. A Social Benefit Cost Analysis was undertaken to quantify the benefits and costs associated with the transactions of the industry's market players, which include vessel owners, traders, the Bureau of Customs, Maritime, LGUs, consumers, and the shadow authorities governing the informal trading of goods in the province in particular, and to estimate the shadow economy's contribution to the ARMM economy in general. Findings revealed that NCSS informal trading is equally beneficial in terms of the private and social profitability of industry players and consumers. This explains the shape of resiliency in the structure, conduct, and performance of NCSS informal trading. The study further demonstrated that the shadow economic activity of the NCSS industry is socially and economically beneficial in maintaining the balance of peace and the resiliency of the population in the area, and that the shadow industry contributed informally to the aggregate economic performance of the province.
APA, Harvard, Vancouver, ISO, and other styles
2

Nwokedi, Theophilus C., Lazarus I. Okoroji, Ifiok Okonko, and Obed C. Ndikom. "Estimates of Economic Cost of Congestion Travel Time Delay Between Onne-Seaport and Eleme-Junction Traffic Corridor." LOGI – Scientific Journal on Transport and Logistics 11, no. 2 (November 1, 2020): 33–43. http://dx.doi.org/10.2478/logi-2020-0013.

Abstract:
Travelers along the Onne-seaport to Eleme-junction road corridor in the hub of the oil and gas industry in Port Harcourt, Nigeria, have continued to experience very serious traffic congestion travel time delays, culminating in lost man-hours and declining productivity. This study estimated the economic cost of traffic congestion travel time delay along the corridor, with a view to providing economic justification for developing traffic management policies and road infrastructure to remedy it. A mixed research approach was adopted in which data were sourced through a field survey and from secondary sources. The gross output model was used to estimate the output losses occasioned by productive time losses related to traffic congestion. The study established that the average daily traffic congestion travel time delays along the corridor for travelers in truck, car, bus, and taxi modes are 104.17 minutes, 46.60 minutes, 58.5 minutes, and 56.4 minutes respectively. The estimated daily aggregate economic cost of output losses associated with traffic congestion time delay on the corridor is 46,049,809.8 naira (210,923.5 USD) for all modes. This justifies investment in traffic congestion remedial strategies along the route.
3

Militz, Thane A., Simon Foale, Jeff Kinch, and Paul C. Southgate. "Natural rarity places clownfish colour morphs at risk of targeted and opportunistic exploitation in a marine aquarium fishery." Aquatic Living Resources 31 (2018): 18. http://dx.doi.org/10.1051/alr/2018006.

Abstract:
As fish stocks become depleted, exploitation eventually fails to be cost-efficient. However, species or morphs of species can suffer from continual exploitation if their rarity results in increased value, justifying the cost-efficiency of targeted or opportunistic exploitation. The trade in coral reef fishes for public and private aquaria is an industry in which naturally rare species and rare morphs of species command high prices. Here we investigate the relationship between price and the natural prevalence of colour morphs of two highly demanded clownfish species using a localised case study. The export prices for colour morphs increased with decreasing prevalence of occurrence (y = 4.60x^(−0.51), R² = 0.43), but price increase was inversely less than the observed reduction in prevalence. This renders rare colour morphs (i.e., those at relatively low prevalence) at risk of opportunistic exploitation. Using ecological data, we also demonstrate how this increased value can subject rare colour morphs with aggregated distributions to targeted exploitation. These findings are discussed in relation to the broader marine aquarium trade, identifying taxa potentially at risk from exploitation motivated by rarity and addressing potential management strategies.
4

Thessen, Anne, Jenette Preciado, Payoj Jain, James Martin, Martha Palmer, and Riyaz Bhat. "Automated Trait Extraction using ClearEarth, a Natural Language Processing System for Text Mining in Natural Sciences." Biodiversity Information Science and Standards 2 (May 22, 2018): e26080. http://dx.doi.org/10.3897/biss.2.26080.

Abstract:
The cTAKES package (using the ClearTK Natural Language Processing toolkit; Bethard et al. 2014, http://cleartk.github.io/cleartk/) has been successfully used to automatically read clinical notes in the medical field (Albright et al. 2013, Styler et al. 2014). It is used on a daily basis by dozens of medical institutions to automatically process clinical notes and extract relevant information. ClearEarth is a collaborative project that brings together computational linguists and domain scientists to port Natural Language Processing (NLP) modules trained on the same types of linguistic annotation to the fields of geology, cryology, and ecology. The goal for ClearEarth in the ecology domain is the extraction of ecologically relevant terms, including eco-phenotypic traits, from text and the assignment of those traits to taxa. Four annotators used Anafora (annotation software; https://github.com/weitechen/anafora) to mark seven entity types (biotic, aggregate, abiotic, locality, quality, unit, value) and six reciprocal property types (synonym of/has synonym, part of/has part, subtype/supertype) in 133 documents, drawn primarily from the Encyclopedia of Life (EOL) and Wikipedia, according to project guidelines (https://github.com/ClearEarthProject/AnnotationGuidelines). Inter-annotator agreement ranged from 43% to 90%. Performance of ClearEarth on identifying named entities in biology text overall was good (precision: 85.56%; recall: 71.57%). The named entities with the best performance were organisms and their parts/products (biotic entities; precision: 72.09%; recall: 54.17%) and systems and environments (aggregate entities; precision: 79.23%; recall: 75.34%). Terms and their relationships extracted by ClearEarth can be embedded in the new ecocore ontology after vetting (http://www.obofoundry.org/ontology/ecocore.html).
This project enables use of advanced industry and research software within natural sciences for downstream operations such as data discovery, assessment, and analysis. In addition, ClearEarth uses the NLP results to generate domain-specific ontologies and other semantic resources.
5

Murray, T. E., J. A. Bartle, S. R. Kalish, and P. R. Taylor. "Incidental Capture of seabirds by Japanese southern bluefin tuna longline vessels in New Zealand waters, 1988-1992." Bird Conservation International 3, no. 3 (September 1993): 181–210. http://dx.doi.org/10.1017/s0959270900000897.

Abstract:
Fishery observers recorded incidental capture of seabirds during 785 days on Japanese bluefin tuna longline vessels around New Zealand between April and August each year, 1988-1992. High numbers of albatrosses Diomedea spp. and petrels Procellaria spp. were caught on longline hooks during setting and drowned. Twelve seabird taxa were recorded, six of them breeding only in New Zealand. Most were breeding adults, except for Grey-headed and Black-browed Albatrosses. No bias in sex ratio was evident except for Grey Petrels, of which nearly all were female. Winter-breeding species were most often caught. Birds were not caught randomly, but in a highly aggregated fashion suggestive of complex behavioural interactions with the fishery. Most albatrosses were caught by day in the south whereas most petrels were caught by night north-east of New Zealand. Highest capture rates occurred at dawn and dusk off north-east New Zealand in June-August. Very large catches at specific sites contributed disproportionately to the overall catch rate. The estimated minimum number of total seabirds caught in New Zealand waters declined from 3,652 in 1988 to 360 in 1992, probably as a result of mitigation measures introduced progressively by the industry and by government regulation. Use of tori lines to prevent birds seizing baits had an effect, as did setting in total darkness in the south. Considerably more work needs to be done on the development of improved mitigation measures. Greater observer coverage is required to measure accurately the mortality of individual seabird species on tuna longlines throughout the Southern Ocean and to determine the effectiveness of mitigation measures.
6

Eigaard, Ole R., Francois Bastardie, Niels T. Hintzen, Lene Buhl-Mortensen, Pål Buhl-Mortensen, Rui Catarino, Grete E. Dinesen, et al. "The footprint of bottom trawling in European waters: distribution, intensity, and seabed integrity." ICES Journal of Marine Science 74, no. 3 (December 6, 2016): 847–65. http://dx.doi.org/10.1093/icesjms/fsw194.

Abstract:
Mapping trawling pressure on benthic habitats is needed as background to support an ecosystem approach to fisheries management. The extent and intensity of bottom trawling on the European continental shelf (0–1000 m) was analysed from logbook statistics and vessel monitoring system data for 2010–2012 at a grid cell resolution of 1 × 1 min longitude and latitude. Trawling intensity profiles with seabed impact at the surface and subsurface level are presented for 14 management areas in the North-east Atlantic, Baltic Sea and Mediterranean Sea. The footprint of the management areas ranged between 53–99% and 6–94% for the depth zone from 0 to 200 m (Shallow) and from 201 to 1000 m (Deep), respectively. The footprint was estimated as the total area of all grid cells that were trawled fully or partially. Excluding the untrawled proportions reduced the footprint estimates to 28–85% and 2–77%. Largest footprints per unit landings were observed off Portugal and in the Mediterranean Sea. Mean trawling intensity ranged between 0.5 and 8.5 times per year, but was less in the Deep zone with a maximum intensity of 6.4. Highest intensities were recorded in the Skagerrak-Kattegat, Iberian Portuguese area, Tyrrhenian Sea and Adriatic Sea. Bottom trawling was highly aggregated. For the Shallow zone the seabed area where 90% of the effort occurred comprised between 17% and 63% (median 36%) of the management area. Footprints were high over a broad range of soft sediment habitats. Using the longevity distribution of the untrawled infaunal community, the seabed integrity was estimated as the proportion of the biomass of benthic taxa where the trawling interval at the subsurface level exceeds their life span. Seabed integrity was low (<0.1) in large parts of the European continental shelves, although smaller pockets of seabed with higher integrity values occur.
The methods developed here integrate official fishing effort statistics and industry-based gear information to provide high-resolution pressure maps and indicators, which greatly improve the basis for assessing and managing benthic pressure from bottom trawling. Further, they provide quantitative estimates of trawling impact on a continuous scale by which managers can steer.
7

Tan, Weiqiang, and Jian Zhang. "Good Days, Bad Days: Stock Market Fluctuation and Taxi Tipping Decisions." Management Science, October 12, 2020. http://dx.doi.org/10.1287/mnsc.2019.3557.

Abstract:
Using taxicab tipping records in New York City (NYC), we develop a novel measure of real-time utility and quantitatively assess the impact of wealth change on the well-being of individuals based on the core tenet of prospect theory. The baseline estimate suggests that a one-standard-deviation increase in the stock market index is associated with a 0.3% increase in the daily average tipping ratio, which translates to an elasticity estimate of 0.3. The impact is short-lived and in line with the wealth effect interpretation. Consistent with loss aversion, we find that the impact is primarily driven by wealth loss rather than gain. We exploit Global Positioning System and timestamp information and design two difference-in-differences tests to establish causal inference. Exploitation of the characteristics of individual stocks suggests that the effect of wealth change on real-time utility is more pronounced in the stocks of firms with large market capitalization. Finally, our aggregate estimate suggests that annual tip revenue in the NYC taxi industry is associated with stock market fluctuations, ranging from −$17.5 million to $12.9 million. This paper was accepted by Tyler Shumway, finance.
8

Woodburn, Matt, Deborah L. Paul, Wouter Addink, Steven J. Baskauf, Stanley Blum, Cat Chapman, Sharon Grant, et al. "Unity in Variety: Developing a collection description standard by consensus." Biodiversity Information Science and Standards 4 (October 9, 2020). http://dx.doi.org/10.3897/biss.4.59233.

Abstract:
Digitisation and publication of museum specimen data is happening worldwide but is far from complete. Museums can start by sharing what they know about their holdings at a higher level, long before each object has its own record. Information about what is held in collections worldwide is needed by many stakeholders, including collections managers, funders, researchers, policy-makers, industry, and educators. To aggregate this information from collections, the data need to be standardised (Johnston and Robinson 2002). So, the Biodiversity Information Standards (TDWG) Collection Descriptions (CD) Task Group is developing a data standard for describing collections, which gives the ability to provide: automated metrics, using standardised collection descriptions and/or data derived from specimen datasets (e.g., counts of specimens), and a global registry of physical collections (i.e., digitised or non-digitised). Outputs will include a data model to underpin the new standard, and guidance and reference implementations for the practical use of the standard in institutional and collaborative data infrastructures. The Task Group employs a community-driven approach to standard development. With international participation, workshops at the Natural History Museum (London 2019) and the MOBILISE workshop (Warsaw 2020) allowed over 50 people to contribute to this work. Our group organized online "barbecues" (BBQs) so that many more could contribute to standard definitions and address data model design challenges. Cloud-based tools (e.g., GitHub, Google Sheets) are used to organise and publish the group's work and make it easy to participate. A Wikibase instance is also used to test and demonstrate the model using real data.
There are a range of global, regional, and national initiatives interested in the standard (see Task Group charter). Some, like GRSciColl (now at the Global Biodiversity Information Facility (GBIF)), Index Herbariorum (IH), and the iDigBio US Collections List are existing catalogues. Others, including the Consortium of European Taxonomic Facilities (CETAF) and the Distributed System of Scientific Collections (DiSSCo), include collection descriptions as a key part of their near-term development plans. As part of the EU-funded SYNTHESYS+ project, GBIF organized a virtual workshop: Advancing the Catalogue of the World's Natural History Collections to get international input for such a resource that would use this CD standard. Some major complexities present themselves in designing a standardised approach to represent collection descriptions data. It is not the first time that the natural science collections community has tried to address them (see the TDWG Natural Collections Description standard). Beyond natural sciences, the library community in particular gave thought to this (Heaney 2001, Johnston and Robinson 2002), noting significant difficulties. One hurdle is that collections may be broken down into different degrees of granularity according to different criteria, and may also overlap so that a single object can be represented in more than one collection description. Managing statistics such as numbers of objects is complex due to data gaps and variable degrees of certainty about collection contents. It also takes considerable effort from collections staff to generate structured data about their undigitised holdings. We need to support simple, high-level collection summaries as well as detailed quantitative data, and to be able to update as needed. We need a simple approach, but one that can also handle the complexities of data, scope, and social needs, for digitised and undigitised collections. 
The data standard itself is a defined set of classes and properties that can be used to represent groups of collection objects and their associated information. These incorporate common characteristics ('dimensions') by which we want to describe, group and break down our collections, metrics for quantifying those collections, and properties such as persistent identifiers for tracking collections and managing their digital counterparts. Existing terms from other standards (e.g. Darwin Core, ABCD) are re-used if possible. The data model (Fig. 1) underpinning the standard defines the relationships between those different classes, and ensures that the structure as well as the content are comparable across different datasets. It centres around the core concept of an 'object group', representing a set of physical objects that is defined by one or more dimensions (e.g., taxonomy and geographic origin), and linked to other entities such as the holding institution. To the object group, quantitative data about its contents are attached (e.g. counts of objects or taxa), along with more qualitative information describing the contents of the group as a whole. In this presentation, we will describe the draft standard and data model with examples of early adoption for real-world and example data. We will also discuss the vision of how the new standard may be adopted and its potential impact on collection discoverability across the collections community.

Dissertations / Theses on the topic "TAXI AGGREGATOR INDUSTRY"

1

Manchanda, Gaurang. "Uncovering Trends in Taxi Aggregator Industry Using Twitter Sentiment Analysis." Thesis, 2016. http://dspace.dtu.ac.in:8080/jspui/handle/repository/17116.

Abstract:
Micro-blogging websites have evolved to become a source of all kinds of information. People post all kinds of real-time messages on micro-blogs, including their experience of services they use, opinions on a variety of topics and current issues, and complaints and positive sentiments about the products they use. Twitter offers a unique dataset in the world of brand sentiment: brands receive sentiment messages directly from their customers in real time on Twitter. These brands have the opportunity to analyze these messages to determine consumer sentiment. The taxi aggregator industry, being a high-volume service industry, receives hundreds of comments daily on its social media pages from customers regarding their experiences, complaints, and opinions on the services provided.

Book chapters on the topic "TAXI AGGREGATOR INDUSTRY"

1

Ganda, Fortune, and Rufaro Garidzirai. "The Environmental Influence of Tax Regimes in Selected European Union Economies." In Green Technologies and Computing Industry [Working Title]. IntechOpen, 2020. http://dx.doi.org/10.5772/intechopen.94552.

Abstract:
In 2019, Eurostat and the European Environment Agency reported that there is still a need to continue implementing zero-carbon practices in European Union (EU) countries, although a 22% decrease in emissions has been noted compared to 1990 levels. This paper employed a system Generalised Method of Moments (GMM) framework to evaluate the environmental impacts of tax systems in 28 selected EU economies from 2010 to 2017. The results of the study show that aggregate environmental tax is not lowering greenhouse gas emissions as effectively as expected, although it improves environmental sustainability. Possibly the environmental tax revenue collected in the European Union countries was not used to enhance energy efficiency; hence it could not lower greenhouse gas emissions. The other findings demonstrate that when environmental tax is disaggregated (energy tax and transport tax), these instruments have been more efficient in lessening emissions and also improve environmental sustainability (in the case of transport tax). The paper therefore highlights the importance of adopting green tax instruments that are more focused and harmonise directly with the environmental goals of EU economies.
2

Cameron, Alicia (Lucy). "National Competition Policy and Broadband Provision in Australia." In Encyclopedia of Developing Regional Communities with Information and Communication Technology, 506–11. IGI Global, 2005. http://dx.doi.org/10.4018/978-1-59140-575-7.ch090.

Abstract:
National Competition Policy (NCP) implemented in Australia from 1995 has had a profound effect on the mode and level of service delivery in nonmetropolitan regional and rural areas. The implementation of NCP followed the lead of other countries in corporatising, segmenting, and privatising many state and national government services and utilities and promoting open global competition as the framework for service delivery in the future. As government moves out of the role of service provision and into the role of industry regulation, there has been significant jurisdiction shifting in terms of responsibility for services, as well as reduced subsidisation for the cost of service over distance: subsidisation that was previously enabled through government-owned nationwide monopolies. This is more of an issue in Australia than in many other countries due to the large landmass and relatively small but dispersed population. Unlike many other countries, however, Australia has been slow to increase the proportion of overall tax revenue given to local government bodies to ensure regional service delivery or to impose community service obligations (CSOs) at local levels. Confused local bodies have been left to build expensive business plans to attract new services in areas for which they currently have little or no funding, and in which they previously had no responsibility or expertise. Local bodies are currently being requested to aggregate demand across government, private, and residential customer bases. Management of the delivery of broadband services is an example of the confusion faced by regional bodies in Australia in the wake of a recently corporatised government utility and a liberalised telecommunications environment.
3

"signified by a rapid increase in aggregate demand and shortages … The budget deficit reached … 19 per cent of GDP in 1991. The decision to liberalize 90 per cent of prices in January 1992 led to a price jump of 245 per cent and by the summer the monetary overhang had been eliminated. Efforts to tighten monetary policy in 1992–4 failed mainly due to attempts to preserve the rouble zone which after the break-up consisted of fifteen independent countries, each with their own central bank. And although the CBR [Central Bank of Russia], under the leadership of Viktor Gerashchenko, was the only one allowed to print roubles the central banks of other CIS countries (and initially also the Baltic States) could issue credits. This meant that monetary policy spun out of control … More countries started to introduce their own currency or issue monetary surrogates. Furthermore, much of the credits issued by CIS central banks were used to finance imports of Russian commodities, mainly oil and gas, which meant that pressure was also put on the CBR and the Russian government by Russian exporters to continue looser monetary policy. As a result by mid-1992 the granting of concessional credits to agriculture and industry intensified. At the same time Russia was unable to increase tax revenues or reduce expenditures and as a result continued to run a large budget deficit. And without access to domestic capital markets and a lack of willingness by the West to lend money to Russia, the only source of finance was the printing presses. This policy resulted in a rapid growth of the money supply." In The Countries of the Former Soviet Union at the Turn of the Twenty-First Century, 478–93. Routledge, 2004. http://dx.doi.org/10.4324/9780203647547-29.


Reports on the topic "TAXI AGGREGATOR INDUSTRY"

1

Bazel, Philip, Jack M. Mintz, and Gerardo Reyes-Tagle. Taxation of the Mining Industry in Latin America and the Caribbean: Analysis and Policy. Inter-American Development Bank, June 2023. http://dx.doi.org/10.18235/0004957.

Abstract:
Little is known about mining taxation in Latin America and the Caribbean (LAC), although it is both particularly complex and has large effects on incentives for investments in mining activities. This paper reviews the types and consequences of mining taxes that are applied in the region and their implications for investment. Most countries assess royalties based on the value of production, which are consistent with royalties applied globally. However, miners confront additional taxes such that tax regimes, in the aggregate, inefficiently discourage investment, including income taxes, non-refundable sales taxes on capital purchases, capital taxes, gross receipt taxes, and real estate transfer taxes. Several reforms emerge from the analysis. The most important is for LAC countries to consider profit-based regimes--similar to Chile, Mexico, and Peru--supplemented by a minimum royalty based on the value of production. Company tax reforms should also be considered with the aim to tax mining similarly to other sectors of the economy to improve the allocation of capital.