Journal articles on the topic 'Expanding Knowledge in the Information and Computing Sciences'




Consult the top 23 journal articles for your research on the topic 'Expanding Knowledge in the Information and Computing Sciences.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Nikolopoulos, Vasileios, Mara Nikolaidou, Maria Voreakou, and Dimosthenis Anagnostopoulos. "Context Diffusion in Fog Colonies: Exploring Autonomous Fog Node Operation Using ECTORAS." IoT 3, no. 1 (January 18, 2022): 91–108. http://dx.doi.org/10.3390/iot3010005.

Abstract:
In Fog Computing, fog colonies are formed by nodes cooperating to provide services to end-users. To enable efficient operation and seamless scalability of fog colonies, decentralized control over participating nodes should be promoted. In such cases, autonomous Fog Nodes operate independently, sharing the context in which all colony members provide their services. In this paper, we explore different techniques of context diffusion and knowledge sharing between autonomous Fog Nodes within a fog colony, using ECTORAS, a publish/subscribe protocol. With ECTORAS, nodes become actively aware of their operating context, share contextual information and exchange operational policies to achieve self-configuration, self-adaptation and context awareness in an intelligent manner. Two different ECTORAS implementations are studied: one offering centralized control through a message broker that manages colony participants and available topics, and one fully decentralized, catering to the erratic topology that Fog Computing may produce. The two schemes are tested, in terms of performance and energy consumption, as the fog colony grows, in a prototype implementation based on Raspberry Pi nodes for smart building management.
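The broker-based variant described in the abstract can be sketched as a minimal topic-based publish/subscribe exchange. ECTORAS itself is not reproduced here; the names (`Broker`, `FogNode`, the topic string) are illustrative assumptions, not the protocol's actual API.

```python
from collections import defaultdict

class Broker:
    """Centralized variant: a broker tracks colony participants and topics."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of fog nodes

    def subscribe(self, topic, node):
        self.subscribers[topic].append(node)

    def publish(self, topic, context):
        # Diffuse the context to every colony member subscribed to the topic.
        for node in self.subscribers[topic]:
            node.on_context(topic, context)

class FogNode:
    def __init__(self, name):
        self.name = name
        self.context = {}  # local copy of the shared operating context

    def on_context(self, topic, context):
        # Self-configuration: merge diffused context into local state.
        self.context[topic] = context

broker = Broker()
n1, n2 = FogNode("node-1"), FogNode("node-2")
broker.subscribe("building/temperature", n1)
broker.subscribe("building/temperature", n2)
broker.publish("building/temperature", {"value": 22.5})
print(n1.context["building/temperature"])  # both nodes now share the context
```

The fully decentralized scheme in the paper removes the single `Broker`; conceptually, each node would keep its own subscriber table and forward messages to peers, trading the broker's single point of failure for gossip-style dissemination.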
APA, Harvard, Vancouver, ISO, and other styles
2

Makori, Elisha Ondieki. "Promoting innovation and application of internet of things in academic and research information organizations." Library Review 66, no. 8/9 (November 7, 2017): 655–78. http://dx.doi.org/10.1108/lr-01-2017-0002.

Abstract:
Purpose – The purpose of the study was to investigate factors promoting innovation and application of the internet of things in academic and research information organizations. Design/methodology/approach – A quantitative research design involved a survey of selected academic and research information organizations in public and private chartered institutions. Information professionals, digital content managers, and information systems and technology specialists who normally consume big data and technological resources were involved in the data collection, which used a structured questionnaire and content analysis. Information organizations and information practitioners were selected from public and private academic and research institutions. Findings – Innovation of the internet of things has increasingly transformed academic and research information organizations as the source of knowledge, in addition to expanding access to education, data, information and communication anywhere, anytime through hyperconnectivity and networking. Internet of things technologies such as the mobile of things, the web of things, digital information systems and personal devices are widely applied by digital natives in academic and research information organizations. Mobile platforms and devices are the single biggest provider of data, information and knowledge in academic and research organizations. Modern trends in education and knowledge practices in academic institutions and information organizations depend upon the internet of things, digital repositories, electronic books and journals, social media interfaces, multimedia applications, information portal hubs and interactive websites, although challenges regarding inadequate information communication technology infrastructure and social computing facilities still persist. Research limitations/implications – Information organizations in public and private chartered academic and research institutions were adopted in the study. Respondents handling and supporting information management, planning and decision-making provided the necessary data. Information professionals, digital content managers, and information systems and technology specialists are proactively involved in data and information analytics. Practical implications – Academic and research information organizations are powerhouses that provide knowledge to support research, teaching and learning for sustainable development and the betterment of humanity and society. Innovation of the internet of things and associated technologies provides practical means of attaining sustainable information development practices in the contemporary knowledge society. Internet of things technologies, principles of economies of scale and investment, and customer needs entail that information organizations and practitioners should provide appropriate and smart systems and solutions. Social implications – Modern academic and research information organizations have the corporate social responsibility to offer technological innovations that heighten access to knowledge and learning in academic and research institutions. Economically, innovation and application of the internet of things provide unlimited access to big data and information in organizations anywhere, anytime. Originality/value – Data management is a growing phenomenon that information practitioners need to fully understand in the digital economy. Information professionals need to embrace and appreciate innovation and application of internet of things technologies, whose role in sustainable development practices is critical in academic and research organizations.
3

Weli, Weli. "Re-examination and expanding the EUCS Model on Cloud-based ERP system." Journal of information and organizational sciences 45, no. 1 (June 15, 2021): 115–35. http://dx.doi.org/10.31341/jios.45.1.7.

Abstract:
The end-user computing satisfaction (EUCS) model has been widely used in previous studies. Enterprise Resource Planning (ERP) systems, in turn, need to be developed in accordance with the cloud computing that dominates current information technology. This study was carried out to test an expansion of the EUCS model in a cloud-based ERP system. The purpose of this research is therefore to re-examine the validity and reliability of the computer application satisfaction model and its relationship with user performance in a cloud-based ERP system. An overall satisfaction variable is added as a mediator between the satisfaction model and user performance. Data were collected through snowball sampling with a questionnaire distributed to cloud-based ERP users. Data processing was conducted using a second-order construct in structural equation modeling with the Partial Least Squares approach. Since data processing using WarpPLS confirmed the validity and reliability of the model and all relationships between variables, this research contributes theoretically to the study of end-user satisfaction with information technology applications. The final section describes limitations and opportunities for future research.
4

Meshcheryakova, N. N. "Methodology for cognition of digital society." Digital Sociology 3, no. 2 (July 28, 2020): 17–26. http://dx.doi.org/10.26425/2658-347x-2020-2-17-26.

Abstract:
Digital sociology, a computational social science that uses modern information systems and technologies, has already formed. But its conflict with traditional sociology and its research methods has not yet been resolved. This conflict can be overcome if we remember that there is a common goal – knowledge of the phenomena and processes of social life – which is primary in relation to the methods to be agreed upon. Digital transformation of sociology is essential, since 1) traditional sociological methods do not solve the problem of providing voluminous, reliable empirical data quickly and with high quality; 2) the transition from contact research methods to unobtrusive ones is in demand. The adaptation of four modern information technologies – cloud computing, big data, the Internet of things and artificial intelligence – for the purposes of sociology provides a qualitative transition in the methodology of knowledge of the digital society. Cloud computing provides researchers with tools; big data provides research materials; Internet of things technology is aimed at collecting indicators (receiving signals) in large volume, in real time, as direct rather than indirect evidence of human behavior. The development of artificial intelligence technology expands the possibility of receiving processed signals of the quality of the social system without building a preliminary hypothesis, in a short time and on a large volume of processed data. Digital transformation of sociology does not mean abandoning traditional methods of sociological analysis, but it involves expanding the competence of the sociologist, which requires a revision of university curricula. At the same time, combining the functions of a subject-matter expert (sociologist) and a data analyst in one specialist is assessed as unpromising; it is proposed instead to combine their professional competencies in unified research projects.
5

Harris, Roger. "Association of Computing Machinery Special Interest Group in Computer Personnel Research. Annual Conference, April 1992, Cincinnati OH, USA." Journal of Information Technology 8, no. 2 (June 1993): 121–23. http://dx.doi.org/10.1177/026839629300800208.

Abstract:
The conference highlighted the increasing complexity of the role of IS personnel. On the one hand, the demands of increased competitiveness are forcing technical experts to gain better understanding of the commercial requirements of the end users they serve, and on the other hand, the opportunities offered by the End-User Computing phenomenon are placing increasing demands on the technical capabilities of the end users themselves. The emerging picture is one of a highly dynamic IS profession, with expanding boundaries, fewer barriers between itself and other professions and offering greater opportunities for those entering it and increased challenges for those already in it.
6

Matos, Ecivaldo De Souza, and Fábio Correia de Rezende. "Raciocínio computacional no ensino de língua inglesa na escola: um relato de experiência na perspectiva BYOD (Computational thinking to teaching English in high school: an experience report in the BYOD perspective)." Revista Eletrônica de Educação 14 (November 6, 2019): 3116073. http://dx.doi.org/10.14244/198271993116.

Abstract:
Computational Thinking (CT) is a set of logical-operational cognitive skills or reasoning processes grounded in Computer Science. Abstraction, pattern recognition, algorithmic reasoning, and decomposition are examples of these skills, which form the four pillars of CT. Some researchers consider these skills useful, and even essential, to the cognitive development of schoolchildren. In this paper, we present practical aspects and possible contributions of CT to developing competence in reading and interpreting English texts. Didactic interventions were carried out in high school classes of a public school, supported by the Bring Your Own Device (BYOD) approach, in which the students used their own smartphones. During these interventions, the students developed concept maps and podcasts and performed online exercises as well as the traditional exam, all of which composed the set of evaluation instruments. It was possible to see that CT skills are intrinsically present in, and contributed to, the development of reading and writing skills in English, as listed in the National Curricular Parameters. According to testimonials, the BYOD approach provided teacher and students with new conceptions and perspectives on the use of electronic equipment in support of the students' own learning.
Keywords: Computational thinking, English teaching, Mobile learning, Computer science education.
7

Hamner, Marvine, and Raza-ur-Rehman Qazi. "Expanding the Technology Acceptance Model to examine Personal Computing Technology utilization in government agencies in developing countries." Government Information Quarterly 26, no. 1 (January 2009): 128–36. http://dx.doi.org/10.1016/j.giq.2007.12.003.

8

Hew, Teck-Soon, and Sharifah Latifah Syed Abdul Kadir. "Predicting instructional effectiveness of cloud-based virtual learning environment." Industrial Management & Data Systems 116, no. 8 (September 12, 2016): 1557–84. http://dx.doi.org/10.1108/imds-11-2015-0475.

Abstract:
Purpose – Cloud computing technology is advancing and expanding at an explosive rate. These advancements have further extended the capabilities of the virtual learning environment (VLE) to provide accessibility anywhere, anytime, where educational resources can be saved, modified, retrieved and shared in the cloud. The purpose of this paper is to examine the predictors of instructional effectiveness of a cloud computing VLE by extending Self-Determination Theory (SDT) and Channel Expansion Theory (CET) with external constructs of VLE interactivity, content design, school support, trust in website, knowledge sharing attitude and demographic variables. Design/methodology/approach – Random sampling data were collected in two waves of a nation-wide survey and analyzed with an artificial neural network approach. Findings – SDT, CET, content design, interactivity, trust in website, school support and demographics significantly predict instructional effectiveness. Research limitations/implications – The study has provided a new paradigm shift from investigating behavioral intention and continuance intention to the effectiveness of an information system. It advocates that the quality of research may be improved by adhering to basic research methodology, starting from rigorous instrument development and validation through to future research direction. Practical implications – The research provides implications for the Ministry of Education, VLE content and service providers, scholars and practitioners. Social implications – The findings of the study may further improve the quality of living of society when the instructional effectiveness of the cloud-based VLE is further enhanced. Originality/value – Existing grid computing VLE studies have focussed on the acceptance of students and teachers, not on instructional effectiveness. Unlike existing studies that examined extrinsic motivational factors (e.g. TAM, UTAUT), this study uses intrinsic motivational factors (e.g. relatedness, competence and autonomy) as well as perceived media richness. Malaysia is the first nation to implement the VLE at a national scale, and the findings from this study provide new insight into the determinants of instructional effectiveness of the VLE system.
9

Khalimova, Sophia, and Anastasiya Ivanova. "Labor Productivity of Economic Sectors in the Regions: The Role of Information and Communication Technologies." Spatial Economics 17, no. 4 (2021): 69–96. http://dx.doi.org/10.14530/se.2021.4.069-096.

Abstract:
This article focuses on the impact that the expanding use of information and communication technologies (ICT) has on the economic development of Russian regions. As various authors have shown, the use of ICT ultimately leads to an increase in factor productivity. Here, we assess to what extent the use of ICT contributes to the growth of economic development efficiency at the regional level, interpreted as labor productivity in certain economic sectors and measured as output per worker. Panel data analysis for Russian regions covers 2015–2018. The analysis shows that the spread of ICT has a positive effect on labor productivity in both mining and manufacturing; when considering labor productivity in mining, regions are divided into two groups, 'resource' and 'non-resource', depending on the role of the extractive industry in the regional economy. It was found that there is a relationship between ICT development indicators and labor productivity, with the significant factors being industry- and region-specific. The widespread adoption of ICT has a positive effect on economic development effectiveness, with a stronger link in manufacturing, while for mining the discovered relationship is less clear. Among the factors affecting labor productivity in mining in 'resource' regions are access to the Internet, the use of cloud services, and involvement in research and development; for 'non-resource' regions, the significant factors are the use of local computer networks, regional ICT subsidies, and the purchase of computing equipment. For manufacturing, the key factors are access to the Internet, the share of high-tech businesses in the regional economy, the purchase of computing equipment, and the use of the services of third-party organizations and ICT specialists.
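The kind of relationship the study estimates, labor productivity regressed on ICT indicators, can be illustrated with a minimal least-squares sketch. The variable names and data below are invented for illustration; the paper's actual panel specification and regional data are not reproduced here.

```python
import numpy as np

# Toy regression of labor productivity on two ICT indicators.
# Synthetic data: productivity = 1.0 + 2.0*internet_access + 1.5*cloud_use + noise.
rng = np.random.default_rng(0)
n = 200
internet_access = rng.uniform(0.5, 1.0, n)   # hypothetical share of firms online
cloud_use = rng.uniform(0.0, 0.6, n)         # hypothetical share using cloud services
productivity = 1.0 + 2.0 * internet_access + 1.5 * cloud_use + rng.normal(0, 0.05, n)

# Ordinary least squares via the normal equations (intercept + two slopes).
X = np.column_stack([np.ones(n), internet_access, cloud_use])
beta, *_ = np.linalg.lstsq(X, productivity, rcond=None)
print(beta)  # estimates should be close to the true (1.0, 2.0, 1.5)
```

A faithful replication of the paper would instead use panel methods (region and year effects over 2015–2018), but the pooled sketch shows the basic productivity-on-ICT regression shape.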
10

Al-Madhagy Taufiq-Hail, Ghilan, Ayed Rheal A. Alanzi, Shafiz A Mohd Yusof, and Madallah M Alruwaili. "Software as a Service (SaaS) Cloud Computing: An Empirical Investigation on University Students’ Perception." Interdisciplinary Journal of Information, Knowledge, and Management 16 (2021): 213–53. http://dx.doi.org/10.28945/4740.

Abstract:
Aim/Purpose: This study aims to propose and empirically validate a model and investigates the factors influencing acceptance and use of Software as a Services cloud computing services (SaaS) from individuals’ perspectives utilizing an integrative model of Theory of Planned Behavior (TPB) and Technology Acceptance Model (TAM) with modifications to suit the objective of the study. Background: Even though SaaS cloud computing services has gained the acceptance in its educational and technical aspects, it is still expanding constantly with emerging cloud technologies. Moreover, the individual as an end-user of this technology has not been given the ample attention pertaining to SaaS acceptance and adoption (AUSaaS). Additionally, the higher education sector needs to be probed regarding AUSaaS perception, not only from a managerial stance, but also the individual. Hence, further investigation in all aspects, including the human factor, deserves deeper inspection. Methodology: A quantitative approach with probability multi-stage sampling procedure conducted utilizing survey instrument distributed among students from three public Malaysian universities. The valid collected responses were 289 Bachelor’s degree students. The survey included the demographic part as well as the items to measure the constructs relationships hypothesized. Contribution: The empirical results disclosed the appropriateness of the integrated model in explaining the individual’s attitude (R2 = 57%), the behavior intention (R2 = 64%), and AUSaaS at the university settings (R2 = 50%). Also, the study offers valuable findings and examines new relationships that considered a theoretical contribution with proven empirical results. That is, the subjective norms effect on attitude and AUSaaS is adding empirical evidence of the model hypothesized. 
Knowing the significance of social effect is important in utilizing it to promote university products and SaaS applications – developed inside the university – through social media networks. Also, the direct effect of perceived usefulness on AUSaaS is another important theoretical contribution the SaaS service providers/higher education institutes should consider in promoting the usefulness of their products/services developed or offered to students/end-users. Additionally, the research contributes to the knowledge of the literature and is considered one of the leading studies on accepting SaaS services and applications as proliferation of studies focus on the general and broad concept of cloud computing. Furthermore, by integrating two theories (i.e., TPB and TAM), the study employed different factors in studying the perceptions towards the acceptance of SaaS services and applications: social factors (i.e., subjective norms), personal capabilities and capacities (i.e., perceived behavioral control), technological factors (i.e., perceived usefulness and perceived ease of use), and attitudinal factors. These factors are the strength of both theories and utilizing them is articulated to unveil the salient factors affecting the acceptance of SaaS services and applications. Findings: A statistically positive significant influence of the main TPB constructs with AUSaaS was revealed. Furthermore, subjective norms (SN) and perceived usefulness (PU) demonstrated prediction ability on AUSaaS. Also, SN proved a statically significant effect on attitude (ATT). Specifically, the main contributors of intention are PU, perceived ease of use, ATT, and perceived behavioral control. Also, the proposed framework is validated empirically and statistically. Recommendation for Researchers: The proposed model is highly recommended to be tested in different settings and cultures. 
Also, recruiting respondents with different roles, occupations, and cultures would likely yield more insights into the results obtained in the current research and their generalizability. Future Research: Participants from private universities or other educational institutes are suggested for future work, as the sample here focused only on public-sector universities. The model included a limited number of variables, suggesting that it can be extended in future work with other constructs such as trialability, compatibility, security, risk, privacy, and self-efficacy. Comparison of different ethnic groups, ages, genders, or fields of study in future research would be invaluable to enhance the findings or reveal new insights. Replication of the study in different settings is encouraged.
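The R2 figures reported in this abstract quantify explained variance. As a reminder of the underlying formula, R2 = 1 − SSres/SStot, here is a minimal Python sketch with made-up scores (purely illustrative, not the study's survey data):

```python
# R^2 = 1 - SS_res / SS_tot: the share of outcome variance explained by a model.
# Toy data below is hypothetical, not the study's responses.

def r_squared(y, y_hat):
    mean_y = sum(y) / len(y)
    ss_tot = sum((v - mean_y) ** 2 for v in y)        # total sum of squares
    ss_res = sum((v - p) ** 2 for v, p in zip(y, y_hat))  # residual sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical observed behavioral-intention scores and model predictions.
y     = [3.0, 4.0, 5.0, 2.0, 4.5]
y_hat = [3.2, 3.8, 4.9, 2.4, 4.3]
print(round(r_squared(y, y_hat), 3))  # prints 0.95
```

An R2 of 0.64 for behavioral intention, as reported above, means the model's predictors account for 64% of the variance in that construct.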
APA, Harvard, Vancouver, ISO, and other styles
11

Lan, Hai, Xinshi Zheng, and Paul M. Torrens. "Spark Sensing: A Cloud Computing Framework to Unfold Processing Efficiencies for Large and Multiscale Remotely Sensed Data, with Examples on Landsat 8 and MODIS Data." Journal of Sensors 2018 (August 23, 2018): 1–12. http://dx.doi.org/10.1155/2018/2075057.

Full text
Abstract:
Inquiry using data from remote Earth-observing platforms often confronts a straightforward but particularly thorny problem: huge amounts of data, in ever-replenishing supplies, are available to support inquiry, but scientists’ agility in converting data into actionable information often struggles to keep pace with the rapidly incoming streams of data that amass in expanding archival silos. Abstraction of those data is a convenient response, and many studies informed purely by remotely sensed data are by necessity limited to a small study area with relatively few scenes of imagery, or they rely on larger mosaics of images at low resolution. As a result, it is often challenging to thread explanations across scales from the local to the global, even though doing so is often critical to the science under pursuit. Here, a solution is proposed, by exploiting Apache Spark, to implement parallel, in-memory image processing with the ability to rapidly classify large volumes of multiscale remotely sensed images and to perform the analysis necessary to detect changes in the time series. It shows that processing on three different scales of Landsat 8 data (up to ~107.4 GB, five-scene, time-series image sets) can be accomplished in 1018 seconds in a local cloud environment. Applying the same framework with slight parameter adjustments, it processed the same-coverage MODIS data in 54 seconds on a commercial cloud platform. Theoretically, the proposed scheme can handle all forms of remote sensing imagery commonly used in the Earth and environmental sciences, requiring only minor adjustments in the parameterization of the computing jobs to adjust to the data. The authors suggest that the “Spark sensing” approach could provide the flexibility, extensibility, and accessibility necessary to keep inquiry in the Earth and environmental sciences at pace with developments in data provision.
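Spark's contribution here is distributing per-scene classification across in-memory partitions. Spark itself is not needed to illustrate the pattern; a stand-in sketch using Python's `concurrent.futures`, with a dummy threshold "classifier" over tiny synthetic tiles (all names, data, and the threshold are hypothetical, not the paper's method):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for per-pixel classification of one image tile: label a pixel
# 1 ("water", say) if its value falls below a reflectance threshold, else 0.
def classify_tile(tile, threshold=0.3):
    return [[1 if px < threshold else 0 for px in row] for row in tile]

# Synthetic "scenes": tiny 2x2 tiles standing in for Landsat granules.
scenes = [
    [[0.1, 0.5], [0.2, 0.8]],
    [[0.9, 0.05], [0.4, 0.25]],
]

# Map the classifier over scenes in parallel, as Spark maps over partitions.
with ThreadPoolExecutor() as pool:
    labelled = list(pool.map(classify_tile, scenes))

print(labelled)  # [[[1, 0], [1, 0]], [[0, 1], [0, 1]]]
```

The same map-over-partitions shape is what lets the framework scale from Landsat to MODIS with only parameter changes.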
APA, Harvard, Vancouver, ISO, and other styles
12

Goodman, James A., Mui Lay, Luis Ramirez, Susan L. Ustin, and Paul J. Haverkamp. "Confidence Levels, Sensitivity, and the Role of Bathymetry in Coral Reef Remote Sensing." Remote Sensing 12, no. 3 (February 4, 2020): 496. http://dx.doi.org/10.3390/rs12030496.

Full text
Abstract:
Remote sensing is playing an increasingly important role in the monitoring and management of coastal regions, coral reefs, inland lakes, waterways, and other shallow aquatic environments. Ongoing advances in algorithm development, sensor technology, computing capabilities, and data availability are continuing to improve our ability to accurately derive information on water properties, water depth, benthic habitat composition, and ecosystem health. However, given the physical complexity and inherent variability of the aquatic environment, most of the remote sensing models used to address these challenges require localized input parameters to be effective and are thereby limited in geographic scope. Additionally, since the parameters in these models are interconnected, particularly with respect to bathymetry, errors in deriving one parameter can significantly impact the accuracy of other derived parameters and products. This study utilizes hyperspectral data acquired in Hawaii in 2000–2001 and 2017–2018 using NASA’s Classic Airborne Visible/Infrared Imaging Spectrometer to evaluate performance and sensitivity of a well-established semi-analytical inversion model used in the assessment of coral reefs. Analysis is performed at several modeled spatial resolutions to emulate characteristics of different feasible moderate resolution hyperspectral satellites, and data processing is approached with the objective of developing a generalized, scalable, automated workflow. Accuracy of derived water depth is evaluated using bathymetric lidar data, which serves to both validate model performance and underscore the importance of image quality on achieving optimal model output. Data are then used to perform a sensitivity analysis and develop confidence levels for model validity and accuracy. Analysis indicates that derived benthic reflectance is most sensitive to errors in bathymetry at shallower depths, yet remains significant at all depths. 
The confidence levels provide a first-order method for internal quality assessment to determine the physical extent of where and to what degree model output is considered valid. Consistent results were found across different study sites and different spatial resolutions, confirming the suitability of the model for deriving water depth in complex coral reef environments, and expanding our ability to achieve automated widespread mapping and monitoring of global coral reefs.
APA, Harvard, Vancouver, ISO, and other styles
13

Zhang, Meimei, Fang Chen, Hang Zhao, Jinxiao Wang, and Ning Wang. "Recent Changes of Glacial Lakes in the High Mountain Asia and Its Potential Controlling Factors Analysis." Remote Sensing 13, no. 18 (September 19, 2021): 3757. http://dx.doi.org/10.3390/rs13183757.

Full text
Abstract:
The current glacial lake datasets in the High Mountain Asia (HMA) region still need to be improved because their boundary divisions in the land–water transition zone are not precisely delineated, and some very small glacial lakes are missing due to their mixed reflectance with the background. In addition, most studies have only focused on changes in the area of a glacial lake as a whole, without addressing the actual per-pixel changes along its boundary or the potential controlling factors. In this research, we produced more accurate and complete maps of glacial lake extent in the HMA in 2008, 2012, and 2016, at consistent time intervals, using Landsat satellite images and the Google Earth Engine (GEE) cloud computing platform, and further studied the formation, distribution, and dynamics of the glacial lakes. In total, 17,016 and 21,249 glacial lakes were detected in 2008 and 2016, respectively, covering areas of 1420.15 ± 232.76 km2 and 1577.38 ± 288.82 km2; the lakes were mainly located at altitudes between 4400 m and 5600 m. The annual areal expansion rate was approximately 1.38% from 2008 to 2016. To explore the cause of the rapid expansion of individual glacial lakes, we investigated their long-term expansion rates by measuring changes in shoreline positions. The results show that glacial lakes are expanding rapidly in areas close to glaciers, with a high expansion rate of more than 20 m/yr from 2008 to 2016. Glacial lakes in the Himalayas showed the highest expansion rate of more than 2 m/yr, followed by the Karakoram Mountains (1.61 m/yr) and the Tianshan Mountains (1.52 m/yr). The accelerating melting of glacier ice and snow caused by global warming is the primary contributor to glacial lake growth. These results may provide information that will help in the understanding of detailed lake dynamics and mechanisms, and also facilitate scientific recognition of the potential hazards associated with glacial lakes in this region.
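The abstract does not name the index used, but lake extraction from Landsat commonly relies on a normalized-difference water index, NDWI = (Green − NIR)/(Green + NIR). A minimal per-pixel sketch with synthetic band values (the threshold and data are illustrative only, not the authors' exact method):

```python
def ndwi(green, nir):
    """Normalized Difference Water Index for one pixel: water is bright in
    green and strongly absorbs near-infrared, so NDWI is high over water."""
    return (green - nir) / (green + nir)

# Synthetic surface-reflectance pairs (green, NIR) for two cover types.
pixels = {"lake": (0.25, 0.05), "rock": (0.10, 0.20)}

# Pixels above a chosen threshold (0.2 here, illustrative) are mapped as water.
water_mask = {name: ndwi(g, n) > 0.2 for name, (g, n) in pixels.items()}
print(water_mask)  # {'lake': True, 'rock': False}
```

On GEE the same computation would run per pixel over full Landsat scenes; the delineation problems the authors describe arise precisely where mixed pixels push NDWI near the threshold.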
APA, Harvard, Vancouver, ISO, and other styles
14

Descals, Adrià, Zoltan Szantoi, Erik Meijaard, Harsono Sutikno, Guruh Rindanata, and Serge Wich. "Oil Palm (Elaeis guineensis) Mapping with Details: Smallholder versus Industrial Plantations and their Extent in Riau, Sumatra." Remote Sensing 11, no. 21 (November 5, 2019): 2590. http://dx.doi.org/10.3390/rs11212590.

Full text
Abstract:
Oil palm is rapidly expanding in Southeast Asia and represents one of the major drivers of deforestation in the region. This includes both industrial-scale and smallholder plantations, with either operational scale posing its own particular social, environmental, and management challenges. Although past studies addressed the mapping of oil palm with remote sensing data, none of them considered the discrimination between industrial and smallholder plantations nor, furthermore, between young and mature oil palm stands. This study assesses the feasibility of mapping oil palm plantations, by typology (industrial versus smallholder) and age (young versus mature), in the largest palm-oil-producing region of Indonesia (Riau province). The impact of using optical images (Sentinel-2) and radar scenes (Sentinel-1) in a Random Forest classification model was investigated. The classification model was implemented in a cloud computing system to map the oil palm plantations of Riau province. Our results show that the mapping of oil palm plantations by typology and age requires a set of optimal features, derived from optical and radar data, to obtain the best model performance (OA = 90.2% and kappa = 87.2%). These features are texture images that capture contextual information, such as the dense harvesting trail network in industrial plantations. The study also shows that the mapping of mature oil palm trees, without distinction between smallholder and industrial plantations, can be done with high accuracy using only Sentinel-1 data (OA = 93.5% and kappa = 86.9%) because of the characteristic backscatter response of palm-like trees in radar scenes. This means that researchers, certification bodies, and stakeholders can adequately detect mature oil palm stands over large regions without training complex classification models, with Sentinel-1 features as the only predictive variables.
The results over Riau province show that smallholders represent 49.9% of total oil palm plantations, which is higher than reported in previous studies. This study is an important step towards a global map of oil palm plantations at different production scales and stand ages that can frequently be updated. Resulting insights would facilitate a more informed debate about optimizing land use for meeting global vegetable oil demands from oil palm and other oil crops.
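The key finding above is that texture features capturing spatial context (e.g. the regular harvesting-trail network) separate industrial from smallholder plantations. A minimal stand-in for one such feature, local variance over a 3x3 window, on a toy grid (this is a crude illustration, not the authors' actual texture features):

```python
# Local variance as a crude texture measure: a regular trail-like pattern
# produces high local variability; homogeneous cover produces none.
def local_variance(grid, r, c):
    # Gather the 3x3 window around (r, c), clipped at the grid edge.
    vals = [grid[i][j]
            for i in range(max(0, r - 1), min(len(grid), r + 2))
            for j in range(max(0, c - 1), min(len(grid[0]), c + 2))]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

smooth = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]   # homogeneous cover
trails = [[5, 0, 5], [5, 0, 5], [5, 0, 5]]   # regular trail-like stripes

print(local_variance(smooth, 1, 1), local_variance(trails, 1, 1))
```

A classifier fed such texture bands, rather than raw reflectance alone, can exploit exactly the contextual contrast the abstract describes.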
APA, Harvard, Vancouver, ISO, and other styles
15

Mazieri, Marcos Rogério, Isabel Cristina Scafuto, and Priscila Rezende Da Costa. "Tokenization, blockchain and web 3.0 technologies as research objects in innovation management." International Journal of Innovation 10, no. 1 (March 17, 2022): 1–5. http://dx.doi.org/10.5585/iji.v10i1.21768.

Full text
Abstract:
The e-mail attributed to Satoshi Nakamoto (supposedly a pseudonym) was transmitted 14 years ago, describing the development of an electronic currency (Nakamoto, 2008). The design of this electronic currency represented a solution to the general Byzantine problem, a well-known problem in computing which, in general terms, holds that one part of a system can intentionally fail and thereby make the entire network unavailable. The premise, therefore, is that part of the system may be corrupt (Dolev et al., 1982). In the few lines of the e-mail, Satoshi Nakamoto described such a solution and, on the same date, published an article with the details. The article describes how to transmit information within a chain of blocks that are: synchronized with date and time (time stamp); combined with code that depends on the previous block (hash code); capable of being validated anonymously and decentrally within a public- and private-key cryptography framework; and highly resilient to any tampering attempt, with a public record. The concept of digital currency, in this case Bitcoin, consisted at that time of a code or token resulting from encryption that could be included in these blocks. Blocks are registered definitively in the ledgers distributed along the blockchain network and can be traced. The digital framework developed by Satoshi Nakamoto, although it emerged to make Bitcoin viable as a digital currency, has been decoupled from that currency over the last 14 years. Blockchain can be understood as a decentralized communication technology that gave rise to a family of other technological structures of encrypted communication, mainly ecosystems, public blockchains, private blockchains, and blockchain networks (Mazumdar & Ruj, 2022). Digital currencies, for their part, have also developed in variety and quantity, so much so that as we write this editorial there are over 10,000 digital currencies in operation.
The total capitalization value of digital currencies rose from USD 18 billion at the beginning of 2017 to surpass USD 1.4 trillion by mid-2021 (Su et al., 2022). Currently, there is no technological impediment to companies creating their own digital currencies using the Bitcoin network or the Ethereum network, for example, as well as many other available networks. Obviously, even today, there are technical challenges related mainly to the scalability of these networks and currencies. Bitcoin, when created, had a capacity of 7 transactions per second; currently, as we write this editorial, the transaction capacity of the Bitcoin network (BTC) is 14 transactions per second. The Ethereum (ETH) network was born with a capacity of 20 transactions per second and currently has a capacity of 35 transactions per second. For comparison purposes, the VISA network has a capacity of 1700 transactions per second, which shows that there is still some way to go before blockchain networks become the new communication backbone, scalable for more mass uses (Chauhan & Patel, 2022). There are implementations such as the Solana network, which promises to reach 50,000 transactions per second, still in the confirmation phase from a practical point of view, and which could allow Internet of Things (IoT) applications to run on this blockchain network (Duffy et al., 2021). At the same time, since 2013, the reorganization of the TCP/IP structure from IPv4 (4.2 billion IP addresses) to IPv6 (79 octillion, or 7.9 × 10^28, times more IP addresses than the total number of IPv4 addresses) has been implemented. This implementation made it possible to expand connectivity to a level sufficient for world demand, amounting to 56 octillion (5.6 × 10^28) addresses per human being on earth.
In terms of addressing, the possibilities of connecting new and future elements to the internet/blockchain communication network are guaranteed, making the IoT (Internet of Things) a real possibility. In addition to the traditional applications dedicated to making digital currency viable, certain works combining information technology and human creativity (also known as the creative economy) have, especially in the last 5 years, brought the NFT (Non-Fungible Token) to the management field. NFTs are tokens (produced through encrypted code, subscribed on some blockchain network) that express the ownership of their author. Whoever acquires an NFT has his/her ownership recorded in a ledger and can therefore exercise the rights or benefits related to possession of that NFT. There are two main origins of an NFT: digital games, and works of art or graphic expressions (Vasan et al., 2022). In the case of digital games, an NFT can be used to record permanently and nominally the “achievements” earned within a given game. Its owner takes possession of a certain item that previously would only have existed within the game itself, a virtual (digital) environment. In the case of graphic and artistic expressions and other works of art, it is possible to make their possession digital. Works from the natural (physical) environment, the result of expressions of human creativity, are registered in an NFT-type token, coming to exist in the virtual (digital) world. In this way, the works, and the data on their authorship and ownership, are permanently registered in the ledger of a blockchain network specialized in transacting NFTs.
As in games, possession of an NFT of a work of art allows the owner to trade or use the benefits related to possession of that NFT. From the convergence of connectivity technologies such as cloud computing, the advent of IPv6, and token-based technologies (blockchain, crypto assets, and NFTs, not exhaustively), the concept of Web 3.0 becomes viable. Web 3.0 can be understood as a network of people and physical objects, making the integration between the natural world and the virtual world more intense (augmented, virtual, and mixed reality). The idea of a Metaverse (a Web 3.0 application) depends both on the technological availability that we describe here very succinctly and on the realization of new social behaviors that are underway (Korkmaz et al., 2022). The context described is not new to most practitioners and academics involved with innovation. However, by describing it in general terms, we can identify different research objects that may be of interest to the community working in the field of innovation management. Evidently, within these research perspectives, especially in innovation management, parallel logics can be established with the more established theories or concepts, allowing an approximation to the new technological objects available to people and companies. Such technologies have permeated traditional companies and startups that have a specific focus on these connectivity technologies, whether as core business or as business support. The idea of this editorial comment is to signal the possibility of receiving more technological articles or scientific articles, perspectives, and book reviews that consider connectivity and tokenization technologies as research objects. Such technologies can be positioned in research both as objects of analysis and as contextual and organizational objects.
Whether contextual or organizational, such positioning can frame research involving routines, capabilities, competencies, and business models whose core business process is innovation at different scales, natures, degrees of novelty, and stages of diffusion or adoption. To cite just one possibility as an example, the model by Tidd and Bessant from 2009, which describes the construct of orientation to innovation strategy and has been used in much research in the field of innovation since then, can be revisited in the new contexts or in the face of new technologies (Ferreira et al., 2015). If such technologies are positioned as objects of analysis, research can involve every part of the innovation management process, such as searching for innovations, selecting innovations, implementing innovations, generating value with innovations, and capturing value with innovations, in single-level or multilevel analysis. In addition to the direct positioning of token- and blockchain-based technologies as an object or as a contextual aspect, adjacent effects are expected, for instance involving intellectual property, environmental and social sustainability, technological governance, people management, and other consequences that may be the focus of research considering the emerging technologies mentioned above. There is also the field of research dedicated to the development of new products, both defining new models of digital product development and methods derived from these models, without forgetting all the implications related to information security management in these contexts of token transactions (Baudier et al., 2022).
Although the possibilities for theoretical and managerial development in the area of innovation research involving token- and blockchain-based technologies are broad, there is research that can be very relevant but would be better received in journals of mathematics, computer science, or even software engineering, rather than in journals dedicated to innovation. Research that develops a new way of doing encryption, a more efficient algorithm that increases the capacity of transactions per second, the design of a new blockchain-based network or ecosystem, or improvements to blockchain consensus protocols undoubtedly has great value, but would be expected in engineering or mathematics journals. On the other hand, there are studies that report implementations of a business application on a blockchain basis, either as a business support application or in the form of designing a blockchain-based product that will be taken to market (Wan et al., 2022). In these cases of applied research, from the point of view of innovation research, what is expected in the article is the development of knowledge that demonstrates how, why, or to what extent the innovation processes were engaged, or in what way the innovation process contributed or presented limitations in supporting the reported implementation.
In this way, such research can be received as technological articles, since the theoretical elements relating the innovation process, or the management of the innovation process, to the token- or blockchain-based implementation will be present; these are the bases of analysis used to support the expansion of innovation theories, innovation management, or management practices in innovation contexts. Finally, we invite the entire community to submit papers with theoretical discussions related to paradigm shifts, involving the dematerialized nature of new products and their tendency towards a service-oriented view (Jain et al., 2022). As should be clear, this editorial comment did not explore all the possibilities of research in innovation management involving token- and blockchain-based technologies, but only a few examples that can help generate insights. We intend, in some way, to encourage the innovation community to develop studies considering new technologies, developing or expanding theories and knowledge of innovation.
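The throughput gap the editorial cites can be made concrete: at a given transactions-per-second (TPS) rate, how long does a fixed workload take to clear? A back-of-envelope sketch using the figures quoted above (the one-million-transaction batch is a hypothetical workload):

```python
# Seconds needed to clear a fixed batch of transactions at a given TPS rate,
# using the throughput figures quoted in the editorial.
def seconds_to_clear(transactions, tps):
    return transactions / tps

batch = 1_000_000  # hypothetical workload of one million transactions
for network, tps in [("Bitcoin", 14), ("Ethereum", 35), ("VISA", 1700)]:
    print(f"{network}: {seconds_to_clear(batch, tps):.0f} s")
```

At 14 TPS the batch takes roughly 20 hours; at VISA's 1700 TPS, under 10 minutes. This is the scalability gap that proposals such as Solana's 50,000 TPS aim to close.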
APA, Harvard, Vancouver, ISO, and other styles
16

Ankita Hatiskar. "Data Mining Based Soft Computing Methods For Web Intelligence." International Journal of Advanced Research in Science, Communication and Technology, October 1, 2022, 1–6. http://dx.doi.org/10.48175/ijarsct-7131.

Full text
Abstract:
Data mining is the procedure of extracting interesting knowledge from the enormous amounts of data contained in databases, including patterns, associations, changes, deviations, and prominent structures. Soft computing methods such as fuzzy logic and artificial neural networks aim to tolerate error and inaccuracy in order to achieve scalable, robust, and low-cost solutions. In today's information age, the Web is the most common distribution medium. Owing to its popularity on the Internet, it is widely used for commercial, entertainment, and educational purposes. Web Intelligence (WI) is engaged with the scientific study of the new areas the Web opens up. It is a new area of computer science that integrates artificial intelligence with sophisticated information technology in the framework of the Web, extending well beyond each of them. In online applications, data mining offers a plethora of possibilities. The biggest concern is figuring out how to identify relevant hidden patterns for improved application. Soft computing techniques such as neural networks, fuzzy logic, support vector machines, and evolutionary computation with genetic algorithms are used to solve this problem. In this research, we look at how soft computing approaches may be used to build web intelligence.
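Of the soft-computing techniques listed, fuzzy logic is the simplest to illustrate: membership in a set is a degree in [0, 1] rather than a yes/no decision. A minimal sketch with a hypothetical "relevant page" fuzzy set (the scores and breakpoints are invented for illustration, not from the paper):

```python
# Triangular fuzzy membership: degree to which a score belongs to a fuzzy set,
# rising linearly from 0 at `low` to 1 at `peak`, then falling to 0 at `high`.
def triangular(x, low, peak, high):
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

# Hypothetical relevance scores for crawled pages, graded rather than binary.
for score in (0.2, 0.5, 0.8):
    print(score, triangular(score, low=0.3, peak=0.6, high=0.9))
```

Graded membership like this is what lets fuzzy systems tolerate the imprecision in web data that a crisp threshold would mishandle.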
APA, Harvard, Vancouver, ISO, and other styles
17

Falchi de Magalhães, Fábio Luís, Marcos Antonio Gaspar, Edimara Mezzomo Luciano, and Domingos Márcio Rodrigues Napolitano. "Information technology governance: legitimation, theorization and field trends." Revista de Gestão ahead-of-print, ahead-of-print (December 30, 2020). http://dx.doi.org/10.1108/rege-01-2020-0001.

Full text
Abstract:
Purpose: To investigate and analyze the aspects of legitimation, theorization, and trends in the evolution of research on information technology governance (ITG) in Brazil, according to researchers familiar with the matter. Design/methodology/approach: By means of qualitative and quantitative research with an exploratory-descriptive approach, the Delphi method was applied using a questionnaire supported by content analysis. Findings: ITG is an increasingly interdisciplinary research field, with significant help from other fields of knowledge, such as administration, computer science, and engineering. The main means of ITG publication are periodicals (MISQ, JMIS, JISTEM, RESI), scientific events (AMCIS, ECIS, HICSS, EnANPAD, CONTECSI), and researchers, such as Peter Weill and Edimara Mezzomo Luciano. Best-practice models are the most significant theoretical frameworks, and the main research trend concerns emerging technologies, such as cloud computing and the Internet of Things (IoT), in the context of ITG. Research limitations/implications: The unavailability of some researchers to participate in the second phase of the Delphi research, as well as the non-completion of a third Delphi round. 
Likewise, “Block B (open answer questions)” was not included in the second phase for a new collection of answers, which could partially change the results presented here. Practical implications: The results show important insights for ITG researchers that can enable new research on its applications, jointly reflecting on relevant aspects for the advancement of this research field. Social implications: There are several research contributions to broaden the discussion and the evolution of this new scientific field in Brazil, which can be grouped for each set of stakeholders: academia and related researchers; the practicing community of business managers and private and public organizations; the academic legitimizing bodies; the non-academic legitimating bodies; and researchers from other areas of knowledge. Originality/value: ITG is a concept that emerged as part of corporate governance (CG) and has evolved as an emerging theme that is expanding in the international academic arena. However, the current stage of legitimation, theorization, and trends of ITG in Brazilian research lacks greater understanding, which would provide better targeting for new research.
APA, Harvard, Vancouver, ISO, and other styles
18

Temchyshyna, Yuliia. "TAX ADMINISTRATION AND CONTROL IN THE CONTEXT OF PROVIDING ECONOMIC DEVELOPMENT OF ECONOMIC ACTIVITIES." Intellect XXІ, no. 2, 2022 (2022). http://dx.doi.org/10.32782/2415-8801/2022-2.14.

Full text
Abstract:
The article examines the current system of tax administration in Ukraine. The purpose of the study is to determine areas for improving tax administration. The information base of the research is the works of domestic and foreign scientists, regulatory and legal documents. Digitalization is a determining factor in business today. The development of information technologies provides new powerful tools for both business and government agencies. Digitalization of tax administration and control will improve the interaction between taxpayers and tax authorities. Digitalization will reduce the costs of tax administration and control and increase the speed and quality of tax services. World practice shows that the tax authorities of various countries are increasingly using modern information collection and analysis technologies. The main types of digital technologies that can be applied in tax administration and control include: Artificial intelligence, Machine vision, Robotic process automation, Cloud computing, Big data, Blockchain, Cognitive automation, Data cubing. The expected result of digitalization of tax administration and control includes: reduction of administrative burden and administrative barriers; expanding the ability of taxpayers to self-service, which will reduce the costs of tax authorities' personnel; formation of trust relations between tax authorities and taxpayers; reduction of time spent on data processing and information exchange; reducing the risks of technical errors; increasing the speed of searching for necessary documents; strengthening control of individual operations of taxpayers; the possibility of remote control; reduction of time for conducting tax audits and reduction of their number; improving the quality of inspections. 
Under the conditions of digitalization, the shadow economy will be reduced, the openness of information will create obstacles for committing tax offenses, and this will increase tax revenues for budgets of various levels, digitalization will create conditions for voluntary compliance with tax legislation by taxpayers.
APA, Harvard, Vancouver, ISO, and other styles
19

Nixon, Zachary, Carl Childs, John Tarpley, and Ben Shorr. "Evolving NOAA SCAT Data Management Standard." International Oil Spill Conference Proceedings 2021, no. 1 (May 1, 2021). http://dx.doi.org/10.7901/2169-3358-2021.1.689533.

Full text
Abstract:
ABSTRACT: To address the growing detail, complexity, and volume of data collected and developed during oil spill response, and to facilitate data sharing and conversion between data collection, storage, and management systems across the diverse parties to a response, the National Oceanic and Atmospheric Administration (NOAA) Office of Response and Restoration (ORR) has developed and published a data management standard for observational Shoreline Cleanup Assessment Technique (SCAT) data. The standard was cooperatively developed by NOAA and others in the response community over the past three years through a series of workshops and meetings. It is agnostic about the physical spill environment, data collection methods, algorithms, software, and computing environment, and requires only the most basic structured data, in order to preserve maximum flexibility for spill-specific conditions and the unanticipated needs of future data collection. NOAA is also in the process of expanding the role of the DIVER (Data Integration Visualization Exploration and Reporting) centralized data warehouse and query tools used to house, query, and visualize analytical results, field observations, photos, and other information. As part of this effort, DIVER is being expanded to ingest and store SCAT data compliant with the standard. We anticipate that use of the standard will be mandated as part of data sharing agreements put in place for future incidents involving NOAA or other federal agencies. As such, we seek to widely disseminate information about the standard to the spill response community. Here, we discuss the components of the standard in detail and provide information on available documentation, example data, file interchange formats, and methods for providing feedback to NOAA.
APA, Harvard, Vancouver, ISO, and other styles
20

Pankajashan, Savaridassan, G. Maragatham, and T. Kirthiga Devi. "Hybrid approach with Deep Auto-Encoder and optimized LSTM based Deep Learning approach to detect anomaly in cloud logs." Journal of Intelligent & Fuzzy Systems, December 4, 2021, 1–15. http://dx.doi.org/10.3233/jifs-201707.

Full text
Abstract:
Anomaly-based detection is concerned with recognizing the uncommon: catching unusual activity and finding the strange action behind it. It has a wide scope of critical applications, from bank application security to the natural sciences to medical systems to marketing apps. Anomaly-based detection, as adopted by various Machine Learning techniques, is in essence a type of system built on artificial intelligence. With the ever-expanding volume and new sorts of information, such as sensor data from an incontestably enormous number of IoT devices and network flow data from cloud computing, there is unsurprisingly a growing enthusiasm for handling more conclusions automatically by means of AI and ML applications. With respect to anomaly detection, however, many applications of the scheme remain focused simply on detection itself. In this paper, Machine Learning (ML) techniques, namely SVM and Isolation Forest classifiers, are experimented with, and with reference to Deep Learning (DL) techniques, the proposed DA-LSTM (Deep Auto-Encoder LSTM) model is adopted for preprocessing of log data and anomaly-based detection, to obtain better detection performance. An enhanced LSTM (long short-term memory) model, optimized for suitable parameters using a genetic algorithm (GA), is utilized to better recognize anomalies from the log data filtered by a Deep Auto-Encoder (DA). The deep neural network models are utilized to convert unstructured log information into training-ready features, which are suitable for log classification in detecting anomalies. These models are assessed using two benchmark datasets: the OpenStack logs and the CIDDS-001 intrusion detection OpenStack server dataset. The outcomes acquired show that the DA-LSTM model performs better than other well-known ML techniques.
We further investigated the performance metrics of the ML and DL models through the well-known indicator measurements, specifically, the F-measure, Accuracy, Recall, and Precision. The exploratory conclusion shows that the Isolation Forest, and Support vector machine classifiers perform roughly 81%and 79%accuracy with respect to the performance metrics measurement on the CIDDS-001 OpenStack server dataset while the proposed DA-LSTM classifier performs around 99.1%of improved accuracy than the familiar ML algorithms. Further, the DA-LSTM outcomes on the OpenStack log data-sets show better anomaly detection compared with other notable machine learning models.
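The abstract outlines a two-stage pipeline: a Deep Auto-Encoder filters raw log data into features, and an LSTM whose hyperparameters are tuned by a genetic algorithm (GA) classifies anomalies. The paper's own code is not reproduced here; the sketch below illustrates only the GA tuning step, under stated assumptions: the search space (`hidden_units`, `dropout`, `window`) and the fitness function, which stands in for the validation accuracy of a trained model, are invented for illustration.

```python
import random

# Hypothetical hyperparameter search space for an LSTM-style model.
SEARCH_SPACE = {
    "hidden_units": [32, 64, 128, 256],
    "dropout": [0.0, 0.2, 0.4],
    "window": [10, 20, 50],
}

def random_individual(rng):
    """One candidate: a random choice from each hyperparameter axis."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    # Stand-in for "train the DA-LSTM, return validation accuracy".
    # Peaks at hidden_units=128, dropout=0.2, window=20.
    score = 1.0
    score -= abs(ind["hidden_units"] - 128) / 256
    score -= abs(ind["dropout"] - 0.2)
    score -= abs(ind["window"] - 20) / 50
    return score

def crossover(a, b, rng):
    """Uniform crossover: each gene comes from one of the two parents."""
    return {k: rng.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rng, rate=0.2):
    """With probability `rate`, resample a gene from the search space."""
    return {k: (rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v)
            for k, v in ind.items()}

def evolve(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection: keep top half
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

In a real setting, `fitness` would train the DA-LSTM with the candidate's hyperparameters and return a validation metric such as F-measure; the loop itself (selection, crossover, mutation) carries over unchanged.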
APA, Harvard, Vancouver, ISO, and other styles
21

Benneworth, Paul. "The Machine as Mythology." M/C Journal 2, no. 6 (September 1, 1999). http://dx.doi.org/10.5204/mcj.1784.

Full text
Abstract:
Machinofacture, computer control and globalisation have created the appearance that in the relation between humanity and the machine the human possesses ever-deepening power. However, this is a very Whiggish view of the history of science and technology as a field of ever-expanding knowledge. History is littered with examples of technologies which have been abandoned as out-dated, where later attempts to revive them have failed because the expertise has been lost. Technology is not merely a reflection of human needs, but an embodiment of the human condition. Machines can be seen as products of their creator, but in the case of long-lived machines they can out-live their creator whilst embodying some of their expertise and their failings. If there is a human need for that lost experience contained within the machine, then there is a form of remote power exercised through the machine. Although the machine can be owned, and the owner 'controls' the machine, it is not a deity-subject (uni-directional) relation; the machine may fail -- because the master does not understand the processes of the machine, there is no way to enforce the power of ownership. This potential for control loss has resonances with the 'Frankenstein syndrome' where the fear is that humanity could unleash something beyond its control. This fear has found recent expression in the debate about genetically-modified (GM) foods in Europe, taking place not over the results of scientific tests; indeed the debate precedes those tests and concerns the effects of releasing them from the direct (space-time) control by humans in laboratories. Frankenstein's monster and GM-foods share the common trait that both are organic, and it makes more sense that a sentient or at least living object could upset the human-object power relation. The inanimate analogue of this (e.g. the golem of Jewish folklore) has a much weaker hold over the popular consciousness.
Asimov 'built' his robots with the laws of robotics to prevent upsetting the hegemony of human over machine. Even recent huge advances in computing power, neural networks and artificial intelligence have come nowhere near producing an Asimov robot with the freedom to have and exercise power over humanity. However, there are other more mundane and diffuse ways that machines can have power over humans. The company Joyce-Loebl, based in the North East of England, from the 1950s to the 1970s built thousands of microdensitometers, and through the effort of its sales teams sold them all over the world. The company was like a family; little was done in the way of formal drawings -- even the machinists were highly skilled and exercised great initiative; the 'secrets' of the machine were passed through incredibly elaborate apprenticeships, and were diffused into many individuals in a range of trades. The machine's inventor described it in correspondence thus: "many scientific measurements result in a series of darkened bars similar to a barcode. To interpret these bars it is necessary to measure their density. The microdensitometer does this by balancing the signal from the bars with light passing through an optical wedge. This balancing technique gives great accuracy". These machines did not embody absolute power of humans over machines; they came about only because of the highly place-specific and combined efforts of a number of highly-skilled complementary craftsmen. At a time when the region was said to be "good for the nearest inch" (i.e. good at shipbuilding) the company made instruments that were "good to the nearest thousandth [of an inch]" (i.e. as precise as clockwork). Loebl, in his forthcoming memoirs, relates a number of examples where the microdensitometer conferred the power to influence human life even when it was notionally under anthropological control.
It found a crashed moon probe from a lunar satellite photograph when all other analyses had failed, and allowed him, as a one-time refugee from the Nazis, to snub the apartheid regime by refusing to sell machines to South African firms. More palpably, it disproved the evidence in a murder appeal where the machine 'proved' that the rope submitted as evidence could not have produced the marks on the neck of the strangulated wife (legal power). Although the machine required an operator, in common with many technologies today there is a separation between the knowledge necessary to manufacture the microdensitometer, and that required to make it carry out its designated functions. It appeared for a time as if microdensitometers were a commodity to be bought and sold; humans controlled them absolutely through determining where they were located. The appearance of absolute control only arose out of a techno-economic configuration particular to the 1960s, dependent on the mass-production and mass marketing of the machine. When this configuration disintegrated, so the balance of power shifted towards the machine. Joyce-Loebl broke up in the 1980s; technologies moved towards analytic software rather than electro-mechanical measurement; the skills of craftsmen were lost; the instrument teams drifted. Electronic instrument standardisation and the effects of the PC on software seemed to spell the end for analogue hardware. However, the microdensitometer remains the most precise instrument for the measurement of grey scale on photograph emulsions, yet the skills to produce microdensitometers have been lost. The Soviets tried for over a decade to reverse engineer the machine, even copying faults in a screw thread, but the machine steadfastly 'refused' to be copied, and the imitation would not work (geopolitical power).
One film-manufacturing multi-national firm has paid thousands of pounds for the refurbishment of one such device from the 1970s (commercial power). The device is still in use in scientific, medical and engineering installations world-wide (technical power). Value is not identical to power, but arises in the independence the machines have as bearers of the skills of their creators. It is not just the skill embodied in those machines, but the machines arise because of the particular contingency of their creation. Although design conventions can exist, machines are purposively designed and manufactured, the outcomes of these processes affecting their final state. The machine is not just the creature its maker desires, but like Frankenstein's Monster, emerges from a struggle to shape the raw materials to the designer's ends, and records that struggle for posterity.
In the case of the micro-densitometer, understanding the reasons for the precise arrangement of the various optics, mechanisms, metal and electronics is impossible. However, in the machine lies a series of messages about the context of the creation of the machine. The North East of England is a declining industrial region; the machine can be read as a recipe for creating material success in a high-technology industry in the North East even given the absence of contemporary activity -- 'assemble a range of disparate craft skills, make a branded product, sell globally, find new avenues for your skill base'. Mythology has served a similar purpose in a number of ancient civilisations. To westerners raised on an abstract, Kiplingesque diet of 'native tales' providing neat explanations of natural phenomena, these myths might appear pointless, but even today, in the context of a particular location, they contain highly encoded cultural information for survival and edification (e.g. Australian Aboriginal peoples). The power of these myths provided access to extensive micro-zoological and anthropological observation and understanding without necessarily understanding why. The Joyce-Loebl microdensitometer came out of a particular situation in the economy of the North East of England which has materially all but vanished. Messrs. Joyce and Loebl built a company making branded equipment selling worldwide, in a way that was and is supposed to be impossible for a heavy industrial region, whose cultural traits of the industrial structure are supposed to endure in the communitarian and anti-entrepreneurial aspirations of the working classes. However, the microdensitometer challenges the notion that the North East was only a centre of heavy industry, but was once somewhere where instruments of beauty and purpose were fashioned and sold.
Just as the Story of the Dreaming explains that "storytelling, while explaining the past, helps young Indigenous Australians maintain dignity and self-respect in the present", there is a modern role for past machines in helping the inhabitants of declining industrial regions maintain their dignity and sustain themselves economically into the future. Much of the debate about industrial renewal in the UK has recently focussed around the notion of the knowledge economy in the abstract form; the microdensitometer is the embodiment of how a knowledge economy can be created. This suggests three potential ways of understanding a machine beyond the delivery of a piece of technological functionality within a production paradigm. A machine can at once have and exercise technological, political and cultural power when the constraints of its control are removed. This brings us back to the starting point of the article, the idea of the Frankenstein monster, who demonstrated a highly spectacular specific physical power; in a modern(-ist?) reality, the power of many 'rogue machines' (those beyond tight contextual control) is entirely more mundane, diffuse and abstract, yet represents a real influence on life experiences in the modern world. Citation reference for this article MLA style: Paul Benneworth. "The Machine as Mythology -- The Case of the Joyce-Loebl Microdensitometer." M/C: A Journal of Media and Culture 2.6 (1999). 
[your date of access] <http://www.uq.edu.au/mc/9909/micro.php>. Chicago style: Paul Benneworth, "The Machine as Mythology -- The Case of the Joyce-Loebl Microdensitometer," M/C: A Journal of Media and Culture 2, no. 6 (1999), <http://www.uq.edu.au/mc/9909/micro.php> ([your date of access]). APA style: Paul Benneworth. (1999) The machine as mythology -- the case of the Joyce-Loebl microdensitometer. M/C: A Journal of Media and Culture 2(6). <http://www.uq.edu.au/mc/9909/micro.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
22

Crouch, David, and Katarina Damjanov. "Extra-Planetary Digital Cultures." M/C Journal 18, no. 5 (August 20, 2015). http://dx.doi.org/10.5204/mcj.1020.

Full text
Abstract:
Digital culture, as we know it, owes much to space exploration. The technological pursuit of outer space has fuelled innovations in signal processing and automated computing that have left an impact on the hardware and software that make our digital present possible. Developments in satellite technologies, for example, produced far-reaching improvements in digital image processing (Gonzalez and Woods) and the demands of the Apollo missions advanced applications of the integrated circuit – the predecessor to the microchip (Hall). All the inventive digital beginnings in space found their way back to earth and contributed to the development of contemporary formations of culture composed around practices dependent on and driven by digital technologies. Their terrestrial adoption and adaptation supported a revolution in information, mediation and communication technologies, increasing the scope and speed of global production, exchange and use of data and advancing techniques of imaging, mapping, navigation, surveillance, remote sensing and telemetry to a point that could only be imagined before the arrival of the space age. Steadily knotted with contemporary scientific, commercial and military endeavours and the fabric of the quotidian, digital devices and practices now have a bearing upon all aspects of our pursuits, pleasures and politics. Our increasing reliance upon the digital shaped the shared surfaces of human societies and produced cultures in their own right. While aware of the uneasy baggage of the term ‘culture’, we use it here to designate all digitally grounded objects, systems and processes which are materially and socially inflecting our ways of life. 
In this sense, we consider both what Michael Hardt and Antonio Negri describe as “those results of social production that are necessary for social interaction and further production, such as knowledges, languages, codes, information, affects, and so forth” (viii), and the material contexts of these products of the social. The effects of digital technologies on the socio-material ambits of human life are many and substantial and – as we want to suggest here – evolving through their ‘extraterrestrial’ beginnings. The contemporary courses of digital cultures not only continue to develop through investments in space exploration, they are themselves largely contingent on the technologies that we have placed in outer space, for instance, global telecommunications infrastructure, GPS, Google maps, weather and climate monitoring facilities and missile grids all rely on the constellation of satellites orbiting the earth. However, we have been increasingly witnessing something new: modes of social production that developed on earth from the technical demands of the space age are now being directed, or rather returned back to have new beginnings beyond the globe. Our focus in this paper is this outward momentum of digital cultures. We do not aim to overview the entire history of the digital in outer space, but instead to frame the extraterrestrial extension of human technologies in terms of the socio-material dimensions of extra-planetary digital cultures. Hannah Arendt described how the space age accelerated the already rapid pace of techno-scientific development, denying us pause during which to grasp its effects upon the “human condition”. Our treacherously fast technological conquest of outer space leaves in its wake an aporia in language and “the trouble”, as Arendt puts it, is that we will “forever be unable to understand, that is, to think and speak about the things which nevertheless we are able to do” (3). 
This crisis in language has at its core a problem of ontology: a failure to recognise that the words we use to describe ourselves are always, and have always been, bound up in our technological modes of being. As thinkers such as Gilbert Simondon and Bernard Stiegler argued and Arendt derided (but could not deny), our technologies are inseparably bound up with the evolutionary continuum of the human and the migration of our digital ways of life into outer space still further complicates articulation of our techno-logic condition. In Stiegler’s view the technical is the primordial supplement to the human into which we have been “exteriorising” our “interiors” of social memory and shared culture to alter, assert and advance the material-social ambits of our living milieu and which have been consequently changing the idea of what it is to be human (141). Without technologies – what Stiegler terms “organised inorganic matter” (17), which mediate our relationships to the world – there is no human in the inhuman extraterrestrial environment and so, effectively, it is only through the organisation of inert matter that culture or social life can exist outside the earth. Offering the possibility of digitally abstracting and processing the complexities and perils of outer space, space technologies are not only a means of creating a human milieu ‘out there’, but of expediting potentially endless extra-planetary progress. The transposition of digital culture into outer space occasions a series of beginnings (and returns). In this paper, we explore extra-planetary digital culture as a productive trajectory in broader discussions of the ontological status of technologies that are socially and materially imbricated in the idea of the human. We consider the digital facilitation of exchanges between earth and outer space and assign them a place in an evolving discourse concerned with expressing the human in relation to the technological. 
We suggest that ontological questions occasioned by the socio-material effects of technologies require consideration of the digital in outer space and that the inhuman milieu of the extraterrestrial opens up a unique perspective from which to consider the nascent shape of what might be the emerging extra-planetary beginnings of the post human.

Digital Exurbias

The unfolding of extra-planetary digital cultures necessitates the simultaneous exteriorisation of our production of the social into outer space and the domestication of our extraterrestrial activities here on earth. Caught in the processes of mediated exploration, the moon, Mars, Pluto and other natural or human-made celestial bodies such as the International Space Station are almost becoming remote outer suburbs – exurbias of earth. Digital cultures are reaching toward and expanding into outer space through the development of technologies, but more specifically through advancing the reciprocal processes of social exchanges between terrestrial and extraterrestrial space. Whether it be through public satellite tracking via applications such as Heavens-Above or The High Definition Earth Viewing system’s continuous video feed from the camera attached to the ISS (NASA, "High Definition") – which streams us back an image of our planetary habitat from an Archimedean point of view – we are being encouraged to embrace a kind of digital enculturation of extraterrestrial space. The production of social life outside our own planet has already had many forms, but perhaps can be seen most clearly aboard the International Space Station, presently the only extraterrestrial environment physically occupied by humans. Amongst its many landmark events, the ISS has become a vigorous node of social media activity.
For example, in 2013 Chris Hadfield became a Twitter phenomenon while living aboard the ISS; the astronaut gathered over a million Twitter followers, made posts on Facebook, Tumblr and Reddit, produced multiple mini-vids, and his rendition of David Bowie’s Space Oddity on YouTube (Hadfield) has thus far been viewed over 26 million times. His success, as has been noted, was not merely due to his use of social media in the unique environment of outer space, but rather that he was able to make the highly technical lives of those in space familiar by revealing to a global audience “how you make a sandwich in microgravity, how you get a haircut” (Potter). This techno-mediation of the everyday onboard ISS is, from a Stieglerian perspective, a gesture toward the establishment of “the relation of the living to its milieu” (49). As part of this process, the new trends and innovations of social media on earth are, for example, continuously replayed and rehearsed in outer space, with a litany of ‘digital firsts’ such as the first human-sent extraterrestrial ‘tweet’, first Instagram post, first Reddit AMA and first Pinterest ‘pin’ (Knoblauch), betraying our obsessions with serial digital beginnings. The constitution of an extra-planetary milieu progresses with the ability to conduct real-time interactions between those on and outside the earth. This, in essence, collapses all social aspects of the physical barrier and the ISS becomes merely a high-tech outer suburb of the globe. Yet fluid, uninterrupted, real-time communications with the station have only just become possible. Previously, the Internet connections between earth and the ISS were slow and troublesome, akin to early dial-up, but the recently installed Optical Payload for Lasercomm Science (OPALS), a laser communications system, now enables the incredible speeds needed to effortlessly communicate with the human orbital outpost in real-time.
After OPALS was affixed to the ISS, it was first tested using the now-traditional system test, “hello, world” (NASA, "Optical Payload"); referencing the early history of digital culture itself, and in doing so, perhaps making the most apt use of this phrase, ever.

Open to Beginnings

Digital technologies have become vital in sustaining social life, facilitating the immaterial production of knowledge, information and affects (Hardt and Negri), but we have also become increasingly attentive to their materialities; or rather, the ‘matter of things’ never went away, it was only partially occluded by the explosion of social interactivities sparked by the ‘digital revolution’. Within the ongoing ‘material turn’, there has been a gamut of inquiries into the material contexts of the ‘digital’, for example, in the fields of digital anthropology (Horst and Miller), media studies (Kirschenbaum, Fuller, Parikka) and science and technology studies (Gillespie, Boczkowski, and Foot) – to mention only a very few of these works. Outside the globe material things are again insistent; they contain and maintain the terrestrial life from which they were formed. Outer space quickens our awareness of the materiality underpinning the technical apparatus we use to mediate and communicate and the delicate support that it provides for the complex of digital practices built upon it. Social exchanges between earth and its extra-planetary exurbias are made possible through the very materiality of digital signals within which these immaterial interactions can take place. In the pared down reality of contemporary life in outer space, the sociality of the digital is also harnessed to bring forth forms of material production. For example, when astronauts in space recently needed a particular wrench, NASA was able to email them a digital file from which they were then able to print the required tool (Shukman).
Through technologies such as the 3D printer, the line between products of the social and the creation of material objects becomes blurred. In extra-planetary space, the ‘thingness’ of technologies is at least as crucial as it is on earth and yet – as it appears – material production in space might eventually rely on the infrastructures occasioned by the immaterial exchanges of digital culture. As technical objects, like the 3D printer, are evolving so too are conceptions of the relationship that humans have with technologies. One result of this is the idea that technologies themselves are becoming capable of producing social life; in this conception, the relationships and interrelationships of and with technologies become a potential field of study. We suggest here that the extra-planetary extension of digital cultures will not only involve, but help shape, the evolution of these relationships, and as such, our conceptions and articulations of a future beyond the globe will require a re-positioning of the human and technical objects within the arena of life. This will require new beginnings. Yet beginnings are duplicitous, as Maurice Blanchot wrote – “one must never rely on the word beginning”; technologies have always been part of the human, our rapport is in some sense what defines the human. To successfully introduce the social in outer space will involve an evolution in both the theory and practice of this participation. And it is perhaps through the extra-planetary projection of digital culture that this will come about. In outer space the human partnership with the objects of technology, far from being a utopian promise or dystopian end, is not only a necessity but also a productive force shaping the collective beginnings of our historical co-evolution. Objects of technology that migrate into space appear designed to smooth the ontological misgivings that might arise from our extra-planetary progress. 
While they are part of the means for producing the social in outer space and physical fortifications against human frailty, they are perhaps also the beginnings of the extraterrestrial enculturation of technologies, given form. One example of such technologies is the anthropomorphic robots currently developed by the Dextrous Robotics Laboratory for NASA. The latest iteration of these, Robonaut 2, was the first humanoid robot in space; it is a “highly dexterous” robot that works beside astronauts performing a wide range of manual and sensory activities (NASA, "Robonaut"). The Robonaut 2 has recorded its own series of ‘firsts’, including being the “first robot inside a human space vehicle operating without a cage, and first robot to work with human-rated tools in space” (NASA, "Robonaut"). One of the things which mark it as a potential beginning is this ability to use the same tools as astronauts. This suggests the image of a tool using a tool – at first glance, something now quite common in the operation of machines – however, in this case the robot is able to manipulate a tool that was not designed for it. This then might also include the machine itself in our own origins, in that evolutionary moment of grasping a tool or stealing fire from the gods. As an exteriorisation of the human, these robots also suggest that a shared extra-planetary culture would involve acknowledging the participation of technologic entities, recognising that they share these beginnings with us, and thus are participating in the origins of our potential futures beyond the globe – the prospects of which we can only imagine now. Identifiably human-shaped, Robonauts are created to socialise with, and labour together with, astronauts; they share tools and work on the same complex tasks in the same environment aboard the International Space Station.
In doing so, their presence might break down the separation between the living and the nonliving, giving form to Stiegler’s hypothesis regarding the ontology of technical objects, and coming to represent a mode of “being” described as “organized inert matter” (49). The robonaut is not dominated by the human, like a hand-held tool, nor is it dominating like a faceless system; it is engineered to be conducted, ‘organised’ rather than controlled. In addition to its anthropomorphic tendencies – which, among other things, make it appear more human than astronauts wearing space suits – is the robonaut’s existence as part of an assemblage of networked life that links technical objects with wet bodies into an animate system of information and matter. While this “heralds the possibility of making the technical being part of culture” (Simondon 16), it also suggests that extra-planetary digital cultures will harness what Simondon formulates as an “ensemble” of “open machines” – a system of sensitive technologies toward which the human acts as “organizer and as a living interpreter” (13). In the design of our extra-planetary envoys we are evolving toward this openness; the Robonaut, a technical object that shares in digital culture and its social and material production, might be the impetus through which the human and technological acquire a language that expresses a kind of evolutionary dialectic. As a system of inclusions that uses technologies to incorporate/socialise everything it can, including its own relationship with technical objects, digital culture in outer space clarifies how technologies might relate and “exchange information with each other through the intermediacy of the human interpreter” (Simondon 14). The Robonaut, like the tweeting astronaut, provides the test signals for what might eventually become points of communication between different modes of being.
In this context, culture is collective cumulative memory; the ‘digital’ form of culture suggests an evolution of both technologic life and human life because it incorporates the development of more efficient means of storing and transmitting memory as cultural knowledge, while recognising the experience of both. Social learning and memory will first define the evolution of the Robonaut. Digital culture and the social expressed through technology – toward a shared social life and cultural landscape established in outer space – will involve the conservation, transmission and setting of common patterns that pool a composite interplay of material, neurobiologic and technologic variables. This will in turn require new practices of enculturation, conviviality with technologies, a sharing, incorporation and care. Only then might this transform into a discussion concerning the ontologies of the ‘we’.

(Far from) Conclusions

Hannah Arendt wrote that technologic progress could not find full expression in “normal” (3) language and that we must constantly be aware that our knowledge, politics, ethics and interactions with regard to technologies are incomplete, unformulated or unexpressed. It could be said then that our relationship with technologies is constantly beginning, that this need to keep finding new language to grasp it means that it actually progresses through its rehearsal of beginnings, through the need to maintain the productive inquisitive force of a pleasant first meeting. Yet Arendt’s idea emerges from a kind of contempt for technology and her implied separation between ‘normal’ and what could be called ‘technical’ language suggests that she privileges the lay ‘human’ tongue as the only one in which meaningful ideas can be properly expressed. What this fails to acknowledge is an appreciation of the potential richness of technical language and Arendt instead establishes a hierarchy that privileges one’s ‘natural’ language.
The invocation of the term ‘normal’ is itself an admission of unequal relations with technologies. For a language to develop in which we can truly begin to express and understand the human relationship with ever-changing but ever-present technologies,, we must first allow the entrance of the language of technology into social life – it must be incorporated, learnt or translated. In the future, this might ultimately give technology a voice in a dialogue that might be half-composed of binary code. Digital culture is perhaps a forerunner of such a conversation and perhaps it is in the milieu of outer space that it could be possible to see advances in our ideas about the mutually co-constitutive relationship between the human and technical. The ongoing extra-planetary extension of the digital cultures have the productive potential to sculpt the material and social ambits of our world, and it is this capacity that may precipitate beginnings which will leave lasting imprints upon the prospects of our shared post-human futures. References Arendt, Hannah. The Human Condition. 2nd ed. Chicago: University of Chicago Press, 1958. Blanchot, Maurice. Friendship. Trans. Elizabeth Rottenberg. Stanford: Stanford University Press, 1997. Originally published in French in 1971 under the title L’Amitié. Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press, 2005. Gillespie, Tarleton, Pablo J. Boczkowski, and Kirsten A. Foot (eds.). Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, Massachusetts: MIT Press, 2014. Gonzalez, Rafael, and Richard E. Woods. Digital Image Processing. 2nd ed. New Jersey: Prentice Hall, 2002. Hadfield, Chris. “Space Oddity.” YouTube, 12 May 2013. 10 Aug. 2015 ‹https://www.youtube.com/watch?v=KaOC9danxNo›. Hall, Eldon C. Journey to the Moon: The History of the Apollo Guidance Computer. Reston: American Institute of Aeronautics and Astronautics, 1996. 
Hardt, Michael, and Antonio Negri. Commonwealth. Cambridge, MA: Harvard University Press, 2009.
Heavens-Above. ‹http://www.heavens-above.com›.
Horst, Heather, and Daniel Miller. Digital Anthropology. London and New York: Berg, 2012.
Kirschenbaum, Matthew. Mechanisms: New Media and the Forensic Imagination. Cambridge, MA: MIT Press, 2008.
Knoblauch, Max. “The 8 First Social Media Posts from Space.” Mashable 13 Aug. 2013. ‹http://mashable.com/2013/08/13/space-social-media-firsts/›.
NASA. “High Definition Earth-Viewing.” ‹http://www.nasa.gov/mission_pages/station/research/experiments/917.html›.
NASA. “Optical Payload for Lasercomm Science (OPALS).” 13 May 2015. ‹http://www.nasa.gov/mission_pages/station/research/experiments/861.html›.
NASA. “Robonaut Homepage.” ‹http://robonaut.jsc.nasa.gov/default.asp›.
Parikka, Jussi. “Dust and Exhaustion: The Labour of New Materialism.” C-Theory 2 Oct. 2013. ‹http://www.ctheory.net/articles.aspx?id=726›.
Potter, Ned. “How Chris Hadfield Conquered Social Media from Outer Space.” Forbes 28 Jul. 2013. ‹http://www.forbes.com/sites/forbesleadershipforum/2013/06/28/how-chris-hadfield-conquered-social-media-from-outer-space›.
Shukman, David. “NASA Emails Spanner to Space Station - Analysis.” BBC News 19 Dec. 2014. ‹http://www.bbc.com/news/science-environment-30549341›.
Simondon, Gilbert. On the Mode of Existence of Technical Objects. Paris: Aubier, Editions Montaigne, 1958. Trans. Ninian Mellamphy. University of Western Ontario, 1980.
Stiegler, Bernard. Technics and Time 1: The Fault of Epimetheus. Stanford: Stanford University Press, 1998.
23

Ali, Kawsar. "Zoom-ing in on White Supremacy." M/C Journal 24, no. 3 (June 21, 2021). http://dx.doi.org/10.5204/mcj.2786.

Full text
Abstract:
The Alt Right Are Not Alright

Academic explorations complicating both the Internet and whiteness have often focussed on the rise of the “alt-right” to examine the co-option of digital technologies to extend white supremacy (Daniels, “Cyber Racism”; Daniels, “Algorithmic Rise”; Nagle). The term “alt-right” refers to media organisations, personalities, and sarcastic Internet users who promote the “alternative right”, understood as extremely conservative political views online. The alt-right, in all of their online variations and inter-grouping, are infamous for supporting white supremacy online, “characterized by heavy use of social media and online memes. Alt-righters eschew ‘establishment’ conservatism, skew young, and embrace white ethnonationalism as a fundamental value” (Southern Poverty Law Center). Theoretical studies of the alt-right have largely focussed on its growing presence across social media and websites such as Twitter, Reddit, and notoriously “chan” sites 4chan and 8chan, through the political discussions referred to as “threads” on these sites (Nagle; Daniels, “Algorithmic Rise”; Hawley). As well, the ability of online users to surpass national boundaries and spread global white supremacy through the Internet has also been studied (Back et al.). The alt-right have found a home on the Internet, using its features to cunningly recruit members and to establish a growing community that mainstreams politically extreme views (Daniels, “Cyber Racism”; Daniels, “Algorithmic Rise”; Munn). This body of knowledge shows that academics have been able to produce critically relevant literature regarding the alt-right despite the online anonymity of the majority of its members. For example, Conway et al., in their analysis of the history and social media patterns of the alt-right, follow the unique nature of the Christchurch Massacre, encompassing the use and development of message boards, fringe websites, and social media sites to champion white supremacy online.
Positioning my research in this literature, I am interested in contributing further knowledge regarding the alt-right, white supremacy, and the Internet by exploring the sinister conducting of Zoom-bombing anti-racist events. Here, I will investigate how white supremacy through the Internet can lead to violence, abuse, and fear that “transcends the virtual world to damage real, live human beings” via Zoom-bombing, an act that is situated in a larger co-option of the Internet by the alt-right and white supremacists, but has been under-theorised as a hate crime (Daniels, “Cyber Racism” 7).

Shitposting

I want to preface this article by acknowledging that while I understand the Internet, through my own external investigations of race, power and the Internet, as a series of entities that produce racial violence both online and offline, I am aware of the use of the Internet to frame, discuss, and share anti-racist activism. Here we can turn to the work of philosopher Michel de Certeau who conceived the idea of a “tactic” as a way to construct a space of agency in opposition to institutional power. This becomes a way that marginalised groups, such as racialised peoples, can utilise the Internet as a tactical material to assert themselves and their non-compliance with the state. Particularly, shitposting, a tactic often associated with the alt-right, has also been co-opted by those who fight for social justice and rally against oppression both online and offline. As Roderick Graham explores, the Internet, and for this exploration, shitposting, can be used to proliferate deviant and racist material but also as a “deviant” byway of oppositional and anti-racist material. Despite this, a lot can be said about the invisible yet present claims and support of whiteness through Internet and digital technologies, as well as the activity of users channelled through these screens, such as the alt-right and their digital tactics.
As Vicki Fraser remarks, “the internet assumes whiteness as the norm – whiteness is made visible through what is left unsaid, through the assumption that white need not be said” (120). It is through the lens of white privilege and claims to white supremacy that online irony, by way of shitposting, is co-opted and understood as an inherently alt-right tool, through the deviance it entails. Their sinister co-option of shitposting bolsters audacious claims as to who has the right to exist, in their support of white identity, but also hides behind a veil of mischief that can hide their more insidious intention and political ideologies. The alt-right have used “shitposting”, an online style of posting and interacting with other users, to create a form of online communication for a translocal identity of white nationalist members. Sean McEwan defines shitposting as “a form of Internet interaction predicated upon thwarting established norms of discourse in favour of seemingly anarchic, poor quality contributions” (19). Far from being random, however, I argue that shitposting functions as a discourse that is employed by online communities to discuss, proliferate, and introduce white supremacist ideals among their communities as well as into the mainstream. In the course of this article, I will introduce racist Zoom-bombing as a tactic situated in shitposting which can be used as a means of white supremacist discourse and an attempt to block anti-racist efforts. By this line, the function of discourse as one “to preserve or to reproduce discourse (within) a closed community” is calculatingly met through shitposting, Zoom-bombing, and more overt forms of white supremacy online (Foucault 225-226). Using memes, dehumanisation, and sarcasm, online white supremacists have created a means of both organising and mainstreaming white supremacy through humour that allows insidious themes to be mocked and then spread online.
Foucault writes that “in every society the production of discourse is at once controlled, selected, organised and redistributed according to a certain number of procedures, whose role is to avert its powers and danger, to cope with chance events, to evade ponderous, awesome materiality” (216). As Philippe-Joseph Salazar recontextualises to online white supremacists, “the first procedure of control is to define what is prohibited, in essence, to set aside that which cannot be spoken about, and thus to produce strategies to counter it” (137). By this line, the alt-right reorganises these procedures and allocates a checked speech that will allow their ideas to proliferate in like-minded and growing communities. As a result, online white supremacists becoming a “community of discourse” advantages them in two ways: first, ironic language permits the mainstreaming of hate that allows sinister content to enter the public as the severity of their intentions is doubted due to the sarcastic language employed. Second, shitposting is employed as an entry gate to more serious and dangerous participation with white supremacist action, engagement, and ideologies. It is important to note that white privilege is embodied in these discursive practices as despite this exploitation of emerging technologies to further white supremacy, there are approaches that theorise the alt-right as “crazed product(s) of an isolated, extremist milieu with no links to the mainstream” (Moses 201). In this way, it is useful to consider shitposting as an informal approach that mirrors legitimised white sovereignties and authorised white supremacy. The result is that white supremacist online users succeed in “not only in assembling a community of actors and a collective of authors, on the dual territory of digital communication and grass-roots activism”, but also shape an effective fellowship of discourse that audiences react well to online, encouraging its reception and mainstreaming (Salazar 142). 
Continuing, as McBain writes, “someone who would not dream of donning a white cap and attending a Ku Klux Klan meeting might find themselves laughing along to a video by the alt-right satirist RamZPaul”. This idea is echoed in a leaked stylistic guide by white supremacist website and message board the Daily Stormer that highlights irony as a cultivated mechanism used to draw new audiences to the far right, step by step (Wilson). As showcased in the screen capture below of the stylistic guide, “the reader is at first drawn in by curiosity or the naughty humor and is slowly awakened to reality by repeatedly reading the same points” (Feinburg). The result of this style of writing is used “to immerse recruits in an online movement culture built on memes, racial panic and the worst of Internet culture” (Wilson).

Figure 1: A screenshot of the Daily Stormer’s playbook, expanding on the stylistic decisions of alt-right writers.

Racist Zoom-Bombing

In the timely text “Racist Zoombombing”, Lisa Nakamura et al. write the following:

Zoombombing is more than just trolling; though it belongs to a broad category of online behavior meant to produce a negative reaction, it has an intimate connection with online conspiracy theorists and white supremacy … . Zoombombing should not be lumped into the larger category of trolling, both because the word “trolling” has become so broad it is nearly meaningless at times, and because zoombombing is designed to cause intimate harm and terrorize its target in distinct ways. (30)

Notwithstanding the seriousness of Zoom-bombing, and to not minimise its insidiousness by understanding it as a form of shitposting, my article seeks to reiterate the seriousness of shitposting, which, in the age of COVID-19, Zoom-bombing has become an example of. I seek to underscore the insidiousness of the tactical strategies of the alt-right online in a larger context of white violence online.
Therefore, I am proposing a more critical look at the tactical use of the Internet by the alt-right, in theorising shitposting and Zoom-bombing as means of hate crimes wherein they impose upon anti-racist activism and organising. Newlands et al. note that, after receiving only limited exposure pre-pandemic, “Zoom has become a household name and an essential component for parties (Matyszczyk, 2020), weddings (Pajer, 2020), school and work” (1). However, through this came the strategic use of co-opting the application by the alt-right to digitise terror and ensure a “growing framework of memetic warfare” (Nakamura et al. 31). Kruglanski et al. label this co-opting of online tools to champion white supremacy operations via Zoom-bombing an example of shitposting:

Not yet protesting the lockdown orders in front of statehouses, far-right extremists infiltrated Zoom calls and shared their screens, projecting violent and graphic imagery such as swastikas and pornography into the homes of unsuspecting attendees and making it impossible for schools to rely on Zoom for home-based lessons. Such actions, known as “Zoombombing,” were eventually curtailed by Zoom features requiring hosts to admit people into Zoom meetings as a default setting with an option to opt-out. (128)

By this, we can draw on existing literature that has theorised white supremacists as innovation opportunists regarding their co-option of the Internet, as supported through Jessie Daniels’s work: “during the shift of the white supremacist movement from print to digital online users exploited emerging technologies to further their ideological goals” (“Algorithmic Rise” 63). Selfe and Selfe write in their description of the computer interface as a “political and ideological boundary land” that may serve larger cultural systems of domination in much the same way that geopolitical borders do (418).
Considering these theorisations of white supremacists utilising tools that appear neutral for racialised aims and the political possibilities of whiteness online, we can consider racist Zoom-bombing as an assertion of a battle that seeks to disrupt racial justice online but also assert white supremacy as its own legitimate cause. My first encounter with local Zoom-bombing was during the Institute for Culture and Society (ICS) Seminar titled “Intersecting Crises” by Western Sydney University. The event sought to explore the concatenation of deeply inextricable ecological, political, economic, racial, and social crises. An academic involved in the facilitation of the event, Alana Lentin, live tweeted during the Zoom-bombing of the event:

Figure 2: Academic Alana Lentin on Twitter live tweeting the Zoom-bombing of the Intersecting Crises event.

Upon reflecting on this instance, I wondered, could efforts have been organised to prevent white supremacy? In considering who may or may not be responsible for halting racist shit-posting, we can problematise the work of R. David Lankes, who writes that “Zoom-bombing is when inadequate security on the part of the person organizing a video conference allows uninvited users to join and disrupt a meeting. It can be anything from a prankster logging on, yelling, and logging off to uninvited users” (217). However, this raises two areas to consider in theorising racist Zoom-bombing as a means of isolated trolling. First, this approach to Zoom-bombing minimises the sinister intentions of Zoom-bombing when referring to people as pranksters. Albeit withholding the “mimic trickery and mischief that were already present in spaces such as real-life classrooms and town halls”, it may be more useful to consider theorising Zoom-bombing as often racialised harassment and a counter aggression to anti-racist initiatives (Nakamura et al. 30).
Due to the live nature of most Zoom meetings, it is increasingly difficult to halt the threat of the alt-right from Zoom-bombing meetings. In “A First Look at Zoombombing” a range of preventative strategies are encouraged for Zoom organisers including “unique meeting links for each participant, although we acknowledge that this has usability implications and might not always be feasible” (Ling et al. 1). The alt-right exploit gaps, akin to co-opting the mainstreaming of trolling and shitposting, to put forward their agenda on white supremacy and assert their presence when not welcome. Therefore, utilising the pandemic to instil new forms of terror, it can be said that Zoom-bombing becomes a new means to shitpost, where the alt-right “exploits Zoom’s uniquely liminal space, a space of intimacy generated by users via the relationship between the digital screen and what it can depict, the device’s audio tools and how they can transmit and receive sound, the software that we can see, and the software that we can’t” (Nakamura et al. 29). Second, this definition of Zoom-bombing begs the question, is this a fair assessment to write that reiterates the blame of organisers? Rather, we can consider other gaps that have resulted in the misuse of Zoom co-opted by the alt-right: “two conditions have paved the way for Zoom-bombing: a resurgent fascist movement that has found its legs and best megaphone on the Internet and an often-unwitting public who have been suddenly required to spend many hours a day on this platform” (Nakamura et al. 29). In this way, it is interesting to note that recommendations to halt Zoom-bombing revolve around the energy, resources, and attention of the organisers to practically address possible threats, rather than the onus being placed on those who maintain these systems and those who Zoom-bomb. As Jessie Daniels states, “we should hold the platform accountable for this type of damage that it's facilitated.
It's the platform's fault and it shouldn't be left to individual users who are making Zoom millions, if not billions, of dollars right now” (Ruf 8). Brian Friedberg, Gabrielle Lim, and Joan Donovan explore the organised efforts by the alt-right to impose on Zoom events and disturb schedules: “coordinated raids of Zoom meetings have become a social activity traversing the networked terrain of multiple platforms and web spaces. Raiders coordinate by sharing links to Zoom meetings targets and other operational and logistical details regarding the execution of an attack” (14). By encouraging a mass coordination of racist Zoom-bombing, in turn, social justice organisers are made to feel overwhelmed and that their efforts will be counteracted inevitably by a large and organised group, albeit appearing prankster-like. Aligning with the idea that “Zoombombing conceals and contains the terror and psychological harm that targets of active harassment face because it doesn’t leave a trace unless an alert user records the meeting”, it is useful to consider to what extent racist Zoom-bombing becomes a new weapon of the alt-right to entertain and affirm current members, and engage and influence new members (Nakamura et al. 34). I propose that we consider Zoom-bombing through shitposting, which is within “the location of matrix of domination (white supremacy, heteropatriarchy, ableism, capitalism, and settler colonialism)” to challenge the role of interface design and Internet infrastructure in enabling racial violence online (Costanza-Chock).

Conclusion

As Nakamura et al. have argued, Zoom-bombing is indeed “part of the lineage or ecosystem of trollish behavior”, yet these new forms of alt-right shitposting “[need] to be critiqued and understood as more than simply trolling because this term emerged during an earlier, less media-rich and interpersonally live Internet” (32).
I recommend theorising the alt-right in a way that highlights the larger structures of white power, privilege, and supremacy that maintain their online and offline legacies beyond Zoom, “to view white supremacy not as a static ideology or condition, but to instead focus on its geographic and temporal contingency” that allows acts of hate crime by individuals on politicised bodies (Inwood and Bonds 722). This corresponds with Claire Renzetti’s argument that “criminologists theorise that committing a hate crime is a means of accomplishing a particular type of power, hegemonic masculinity, which is described as white, Christian, able-bodied and heterosexual” – an approach that can be applied to theorisations of the alt-right and online violence (136). This violent white masculinity occupies a hegemonic hold in the formation, reproduction, and extension of white supremacy that is then shared, affirmed, and idolised through a racialised Internet (Donaldson et al.). Therefore, I recommend that we situate Zoom-bombing as a means of shitposting, by reiterating the severity of shitposting with the same intentions and sinister goals of hate crimes and racial violence.

References

Back, Les, et al. “Racism on the Internet: Mapping Neo-Fascist Subcultures in Cyber-Space.” Nation and Race: The Developing Euro-American Racist Subculture. Eds. Jeffrey Kaplan and Tore Bjørgo. Northeastern UP, 1993. 73-101.
Bonds, Anne, and Joshua Inwood. “Beyond White Privilege: Geographies of White Supremacy and Settler Colonialism.” Progress in Human Geography 40 (2015): 715-733.
Conway, Maura, et al. “Right-Wing Extremists’ Persistent Online Presence: History and Contemporary Trends.” The International Centre for Counter-Terrorism – The Hague. Policy Brief, 2019.
Costanza-Chock, Sasha. “Design Justice and User Interface Design, 2020.” Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. Association for Computing Machinery, 2020.
Daniels, Jessie. “The Algorithmic Rise of the ‘Alt-Right.’” Contexts 17 (2018): 60-65.
———. “Race and Racism in Internet Studies: A Review and Critique.” New Media & Society 15 (2013): 695-719.
———. Cyber Racism: White Supremacy Online and the New Attack on Civil Rights. Rowman and Littlefield, 2009.
De Certeau, Michel. The Practice of Everyday Life. First ed. U of California P, 1980.
Donaldson, Mike. “What Is Hegemonic Masculinity?” Theory and Society 22 (1993): 643-657.
Feinburg, Ashley. “This Is The Daily Stormer’s Playbook.” Huffington Post 13 Dec. 2017. <http://www.huffpost.com/entry/daily-stormer-nazi-style-guide_n_5a2ece19e4b0ce3b344492f2>.
Foucault, Michel. “The Discourse on Language.” The Archaeology of Knowledge and the Discourse on Language. Trans. A.M. Sheridan Smith. Pantheon, 1971. 215-237.
Fraser, Vicki. “Online Bodies and Sexual Subjectivities: In Whose Image?” The Racial Politics of Bodies, Nations and Knowledges. Eds. Barbara Baird and Damien W. Riggs. Newcastle: Cambridge Scholars Publishing, 2015. 116-132.
Friedberg, Brian, Gabrielle Lim, and Joan Donovan. “Space Invaders: The Networked Terrain of Zoom Bombing.” Harvard Shorenstein Center, 2020.
Graham, Roderick. “Race, Social Media and Deviance.” The Palgrave Handbook of International Cybercrime and Cyberdeviance. Eds. Thomas J. Holt and Adam M. Bossler, 2019. 67-90.
Hawley, George. Making Sense of the Alt-Right. Columbia UP, 2017.
Henry, Matthew G., and Lawrence D. Berg. “Geographers Performing Nationalism and Hetero-Masculinity.” Gender, Place & Culture 13 (2006): 629-645.
Kruglanski, Arie W., et al. “Terrorism in Time of the Pandemic: Exploiting Mayhem.” Global Security: Health, Science and Policy 5 (2020): 121-132.
Lankes, R. David. Forged in War: How a Century of War Created Today's Information Society. Rowman & Littlefield, 2021.
Ling, Chen, et al. “A First Look at Zoombombing, 2021.” Proceedings of the 42nd IEEE Symposium on Security and Privacy. Oakland, 2021.
McBain, Sophie. “The Alt-Right, and How the Paranoia of White Identity Politics Fuelled Trump’s Rise.” New Statesman 27 Nov. 2017. <http://www.newstatesman.com/culture/books/2017/11/alt-right-and-how-paranoia-white-identity-politics-fuelled-trump-s-rise>.
McEwan, Sean. “Nation of Shitposters: Ironic Engagement with the Facebook Posts of Shannon Noll as Reconfiguration of an Australian National Identity.” Journal of Media and Communication 8 (2017): 19-39.
Morgensen, Scott Lauria. “Theorising Gender, Sexuality and Settler Colonialism: An Introduction.” Settler Colonial Studies 2 (2012): 2-22.
Moses, A. Dirk. “‘White Genocide’ and the Ethics of Public Analysis.” Journal of Genocide Research 21 (2019): 1-13.
Munn, Luke. “Algorithmic Hate: Brenton Tarrant and the Dark Social Web.” VoxPol, 3 Apr. 2019. <http://www.voxpol.eu/algorithmic-hate-brenton-tarrant-and-the-dark-social-web>.
Nagle, Angela. Kill All Normies: Online Culture Wars from 4chan and Tumblr to Trump and the Alt-Right. Zero Books, 2017.
Nakamura, Lisa, et al. Racist Zoom-Bombing. Routledge, 2021.
Newlands, Gemma, et al. “Innovation under Pressure: Implications for Data Privacy during the COVID-19 Pandemic.” Big Data & Society July-December (2020): 1-14.
Perry, Barbara, and Ryan Scrivens. “White Pride Worldwide: Constructing Global Identities Online.” The Globalisation of Hate: Internationalising Hate Crime. Eds. Jennifer Schweppe and Mark Austin Walters. Oxford UP, 2016. 65-78.
Renzetti, Claire. Feminist Criminology. Routledge, 2013.
Ruf, Jessica. “‘Spirit-Murdering' Comes to Zoom: Racist Attacks Plague Online Learning.” Issues in Higher Education 37 (2020): 8.
Salazar, Philippe-Joseph. “The Alt-Right as a Community of Discourse.” Javnost – The Public 25 (2018): 135-143.
Selfe, Cynthia L., and Richard J. Selfe, Jr. “The Politics of the Interface: Power and Its Exercise in Electronic Contact Zones.” College Composition and Communication 45 (1994): 480-504.
Southern Poverty Law Center. “Alt-Right.” <http://www.splcenter.org/fighting-hate/extremist-files/ideology/alt-right>.
Wilson, Jason. “Do the Christchurch Shootings Expose the Murderous Nature of ‘Ironic’ Online Fascism?” The Guardian, 16 Mar. 2019. <http://www.theguardian.com/world/commentisfree/2019/mar/15/do-the-christchurch-shootings-expose-the-murderous-nature-of-ironic-online-fascism>.
