Dissertations/Theses

Click here to access the files directly from the Biblioteca Digital de Teses e Dissertações da UnB

2024
Dissertations
1
  • Gleice Louise Garcia Costa dos Santos
  • Analysis of the Use of Educational Resources in a Virtual Learning Environment and the Impacts on Student Dropout
  • Advisor: LETICIA LOPES LEITE
  • COMMITTEE MEMBERS:
  • LETICIA LOPES LEITE
  • EDISON ISHIKAWA
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • CINTIA INES BOLL
  • Date: Feb 15, 2024

  • Abstract:
  • Many educational problems are complex, and there are no quick fixes or immediate answers to them. Dropout is one such problem and deserves special attention, which justifies studies, research and critical reflection on the nuances that underlie, permeate and constitute approaches capable of helping to understand and combat this phenomenon. The use of Information and Communication Technology brings people together even when they are physically distant, but it can create a barrier for those who lack access to technological resources or who have not developed certain skills. The technological resources that Distance Education employs, through Virtual Learning Environments, allow the construction of knowledge to occur in different spaces, with teachers and students carrying out activities in different places or at different times. This work therefore addresses the use of the educational technology resources available in Virtual Learning Environments, based on a mapping of the resources used in undergraduate courses of the Open University of Brazil program at the University of Brasília. The research is an exploratory case study with a qualitative approach; the data analysis identified the most used resources and related them to the dropout rate of the respective courses. The results suggest that the diversity of resources can affect dropout rates.

2
  • Ilo César Duarte Cabral
  • Mining of Federal Legislative Data for Analysis and Prediction of Lawmaking

  • Advisor: GLAUCO VITOR PEDROSA
  • COMMITTEE MEMBERS:
  • EDUARDO DE PAULA COSTA
  • GLAUCO VITOR PEDROSA
  • JOHN LENON CARDOSO GARDENGHI
  • LUIS PAULO FAINA GARCIA
  • Date: Feb 26, 2024

  • Abstract:
  • The release of legislative data by the Brazilian government opened an opportunity to understand aspects of the legislative process. By analyzing historical patterns and relevant variables, it is possible to anticipate legislative outcomes, optimizing decision-making. Predicting the votes of deliberative bodies, for example, can lead to a better understanding of government policies and thus generate actionable strategies, allowing legislators to identify critical issues, allocate resources efficiently and anticipate possible impasses. This work investigated analysis and prediction models that make the most of publicly accessible, heterogeneous legislative data to understand the approval or rejection of legislative proposals. To this end, classification models based on machine learning algorithms and natural language processing were developed on categorical, textual and procedural data of legislative propositions, in order to identify discriminative factors that could influence the approval of Bills and Amendment Projects. As a contribution, the classification models were evaluated in five scenarios, using different sets of attributes. The results show an F1-score of up to 70% considering only the categorical data of the propositions and, when the procedural data are aggregated, an F1-score of up to 97%. The tests carried out demonstrate the feasibility of predicting the approval of a proposition during its flow through the legislative process, generating results that add knowledge and lead to a better understanding of the Brazilian legislative process at the federal level.
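The kind of pipeline this abstract describes, combining categorical metadata with proposition text in one classifier evaluated by F1-score, can be sketched as below. This is an illustrative toy example built on scikit-learn with invented data; it is not the dissertation's actual models, features or dataset:

```python
# Hypothetical sketch: classify legislative propositions as approved/rejected
# from categorical metadata plus summary text, scored with F1.
# All column names and rows are invented for illustration.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

df = pd.DataFrame({
    "party":   ["A", "B", "A", "C", "B", "A", "C", "B"] * 10,
    "house":   ["camara", "senado"] * 40,
    "summary": ["tax reform bill", "education funding", "health budget",
                "tax exemption", "security spending", "pension reform",
                "environmental rule", "transport subsidy"] * 10,
    "approved": [1, 0, 1, 0, 1, 1, 0, 0] * 10,
})

# One-hot encode the categorical columns; TF-IDF the free text.
features = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["party", "house"]),
    ("txt", TfidfVectorizer(), "summary"),
])
model = Pipeline([("features", features),
                  ("clf", LogisticRegression(max_iter=1000))])

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="approved"), df["approved"],
    test_size=0.25, random_state=0, stratify=df["approved"])
model.fit(X_train, y_train)
print(f"F1 = {f1_score(y_test, model.predict(X_test)):.2f}")
```

Adding the procedural (tramitação) history as extra columns in the same `ColumnTransformer` is the natural extension of this sketch.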

3
  • Ricardo Cordeiro Galvão Sant'Ana Van Erven
  • UXAPP: Evaluation of the User Experience of Digital Products through Emotion Recognition

  • Advisor: EDNA DIAS CANEDO
  • COMMITTEE MEMBERS:
  • ANA PAULA BERNARDI DA SILVA
  • EDNA DIAS CANEDO
  • REJANE MARIA DA COSTA FIGUEIREDO
  • THIAGO DE PAULO FALEIROS
  • Date: Feb 28, 2024

  • Abstract:
  • Context: measuring user experience (UX) is essential to create value in digital transformation. Measurement makes it possible to identify future purchase intentions, user loyalty, and retention. The traditional way of measuring user experience through self-assessment has known limitations. Therefore, a more straightforward approach is needed to measure user experience automatically.
    Objective: implement and validate the UX evaluation model through a tool that automatically calculates the user experience rating of a digital product and presents positive, neutral, and negative points in using this product. The model included a work process, an exploratory experiment, and an application, which we call UXAPP.
    Methods: we identified and selected state-of-the-art work on emotion recognition with Artificial Intelligence in the context of user satisfaction. We then proposed and implemented the UX evaluation model, developed the UXAPP application, and conducted the exploratory experiment following the work process. We invited nine participants to perform four tasks each, collected manual input data, and captured emotions from each user's video and speech. We then analyzed both data sources and generated a UX report. Finally, we compared the results obtained from UXAPP with the user input data.
    Results: the UXAPP UX evaluation directly matches the user evaluation in 50% of the tasks and gives a close answer in another 47.22% of the results. Each UX element, such as usability, affect, and user value, was evaluated and analyzed independently. The UXAPP usability analysis matches the user evaluation in 30.56% of the tasks and gives a close answer in another 33.33% of the results. The UXAPP affect analysis matches the user evaluation in 52.78% of the tasks and gives a close answer in another 41.67% of the results. UXAPP also identified 616 points of positive or negative sentiment in video and speech across all tasks. The UXAPP user value analysis matches the user evaluation in 36.11% of the tasks and gives a close answer in 50.00% of the results. We recorded 5h 35m 31s of task duration. Based on our experience, we estimate that a person would need at least four times this duration to process all the information presented in the UX report, including finding all sentiment points and distinguishing positive from negative sentiment peaks.
    Conclusions: the possibilities for applying emotion recognition are countless in terms of contexts, techniques, forms, and components. Even so, it was possible to develop UXAPP and validate the UX evaluation model for measuring the user experience of a digital product through emotion recognition and Artificial Intelligence. We found that the UXAPP analysis and the user evaluation analysis are both important and complementary. The main contribution of this work is a validated model to automatically measure user experience and identify positive, neutral, and negative points in the use of a digital product. UXAPP can be used directly by the final user, making it possible to automate the effort of obtaining a UX evaluation score and identifying the positive and negative points of user interaction. It can drastically reduce the costs involved in developing user experience competence in an organization. This research also defines a measurement experimentation process that any organization or researcher can reproduce.

     

4
  • Luan Borges dos Santos
  • SWPTMAC: Sleep Wake-up Power Transfer MAC Protocol

  • Advisor: MARCELO ANTONIO MAROTTA
  • COMMITTEE MEMBERS:
  • ANTONIO MARCOS ALBERTI
  • LUCAS BONDAN
  • MARCELO ANTONIO MAROTTA
  • MARCOS FAGUNDES CAETANO
  • Date: Feb 28, 2024

  • Abstract:
  • Wireless Underground Sensor Networks (WUSNs) are complex systems comprising underground sensors interconnected through wireless communication technologies. These networks play a crucial role in monitoring subsurface environments, but they face a formidable challenge concerning their Network Lifetime (NL), defined as the maximum duration over which the network remains operational and thus connected to a designated observation area. Given the importance of prolonging NL to ensure comprehensive coverage of the observed region, wireless power transfer stands out as a leading solution for extending it. Nonetheless, existing sleep-wakeup protocols were not originally engineered to support this paradigm, resulting in suboptimal network performance. We therefore introduce Sleep-Wakeup Power-Transfer Media Access Control (SWPTMAC), a novel sleep-wakeup protocol explicitly tailored for wireless power transfer in WUSNs, with the overarching aim of optimizing the network's operational lifetime. SWPTMAC was evaluated through comprehensive simulations using the Castalia simulator. The empirical findings show an average improvement of approximately 24% over incumbent protocols in the domain.

5
  • Renan Silveira Holtermann
  • Assessment, Analysis and Treatment of Educational Risks: A Study Applied from the Perspective of System Dynamics

  • Advisor: RICARDO MATOS CHAIM
  • COMMITTEE MEMBERS:
  • LUCIANO HUGO MIRANDA FILHO
  • MARCELO LADEIRA
  • RICARDO MATOS CHAIM
  • SIMONE BORGES SIMAO MONTEIRO
  • Date: Apr 17, 2024

  • Abstract:
  • Main goal: mapping the relationships in which a higher education institution is embedded, from the perspective of system dynamics. Complementarily, the specific objectives of the research are: to develop a literature review related to the research topic, including specific literature, studies by specialized organizations, and decrees, norms and legislation issued by educational regulatory bodies; to understand the risk factors linked to the temporary transition from face-to-face to remote classes; to understand the risk factors linked to the return to face-to-face classes; and to identify the educational impacts during the transition, based on indicators of stricto sensu graduate students at the studied institution.

6
  • Eliane Cunha Marques
  • Risk Management Process Model for the Information Technology Assets of the Ebserh Network

  • Advisor: RICARDO MATOS CHAIM
  • COMMITTEE MEMBERS:
  • RICARDO MATOS CHAIM
  • GLADSTON LUIZ DA SILVA
  • MARCELO LADEIRA
  • ROSALVO ERMES STREIT
  • Date: Apr 24, 2024

  • Abstract:
  • This study addresses risk management in Information Technology at the Brazilian Company of Hospital Services (Ebserh). The Ebserh Network consists of the Central Administration and its branches, comprising 41 federal university hospitals distributed throughout Brazil. There are standards, tools and methods that help companies manage their risks; however, it was found that a single methodology could not cover the Information Technology specificities of all 41 federal university hospitals in the Ebserh Network. Given this, the objective of this study is to propose a Risk Management Process model for the IT assets of the Ebserh Network. To this end, bibliographical, documental, basic research was carried out with a qualitative, quantitative and descriptive approach in order to obtain the theoretical basis needed to develop the model. The resulting IT Risk Management Process of the Ebserh Network is based on ISO/IEC 31000, ISO/IEC 27005, internal regulations and those of the Federal Government, allowing better management of the uncertainties surrounding the implementation of Ebserh's IT organizational strategies. After being prepared, the model was validated and subsequently published and implemented across the Ebserh Network. For monitoring, business intelligence (BI) techniques were used, allowing the visualization of information through dashboards and generating knowledge for decision making. The results showed that the IT risk management process model was satisfactory and that the data obtained from the BI application, aligned with other internal tools, allowed Ebserh to view its IT risk information more accurately. This study showed that a well-implemented IT risk management process model, combined with BI techniques and quality data, can provide inputs for Ebserh to make better decisions according to the analysis of the company's IT risk levels, and can also be used by other public institutions that need an IT risk management process model.

7
  • Diego Costa Sombra
  • Decision Support Model for Source Code Audit: A Case Study

  • Advisor: EDISON ISHIKAWA
  • COMMITTEE MEMBERS:
  • EDISON ISHIKAWA
  • GISLANE PEREIRA SANTANA
  • JOAO CARLOS FELIX SOUZA
  • MARCIO DE CARVALHO VICTORINO
  • Date: May 10, 2024

  • Abstract:
  • With the growing demand in the software development market and increasingly tight deadlines driven by agile methodologies, the associated risks to these software have increased in recent years. Cybercriminals exploit security, quality, and compliance vulnerabilities to commit cybercrimes against companies, resulting in financial losses and damage to the organizations’ reputation. Therefore, it is essential for companies involved in the development and supply of such software to understand how to identify and prioritize issues that require immediate attention.
    In the literature, existing studies address software risks in isolation and do not provide a consolidated view of these risks; a model is lacking to assist in decision-making about which risk, and which part of the software, needs to be addressed urgently.
    In response to this challenge, this study aimed to understand the risks involved and explore the methods, techniques, and tools available for validating these risks in market software. After identifying the risks, methods, techniques, and tools were applied to the software, validating the presence of these risks. Upon confirming the existence of the risks, the FAHP multicriteria decision support method was used to assist in risk classification, determining which part of the software and which risk should be prioritized first.
    The results indicated that, among the nine software modules, the Web module with 34.57%, combined with the Vulnerability risk with 50.35%, needs to be prioritized. This decision support model emerges as a contribution to decision-making, especially in the field of software engineering.
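As a rough illustration of the prioritization mechanics behind (F)AHP, the sketch below derives a priority vector from a pairwise-comparison matrix using plain AHP with the geometric-mean method; the fuzzy variant used in the study layers fuzzified judgments on top of this idea. The matrix values and risk labels are hypothetical, not taken from the case study:

```python
# Plain-AHP illustration: weights from a pairwise-comparison matrix
# via row geometric means. Hypothetical comparisons of three risk types
# (vulnerability vs. quality vs. compliance) on Saaty's 1-9 scale.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],   # vulnerability vs. (vuln, quality, compliance)
    [1/3, 1.0, 2.0],   # quality
    [1/5, 1/2, 1.0],   # compliance
])

gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
weights = gm / gm.sum()                    # normalized priority vector
print(weights.round(3))                    # largest weight -> top priority
```

Running the same computation per software module and per risk, then combining the two priority vectors, is the ranking step the abstract refers to.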

8
  • Denise Soares Magalhães
  • Framework for Assessing the Model Risk Management Maturity of Financial Institutions

  • Advisor: SIMONE BORGES SIMAO MONTEIRO
  • COMMITTEE MEMBERS:
  • ARI MELO MARIANO
  • JOAO GABRIEL DE MORAES SOUZA
  • RICARDO COSER MERGULHAO
  • SIMONE BORGES SIMAO MONTEIRO
  • Date: May 15, 2024

  • Abstract:
  • As a result of advances in technology and the exponential growth of data, the development and use of analytical (quantitative) models has become widespread in organizations. Alongside the quantitative increase in models, their complexity also grows. While this increases the speed of innovation, it also increases the level of risk the organization is exposed to, as well as the need for specific controls and governance. In this context, the present study proposes a practical method for assessing maturity in Model Risk Management, considering the main model risk factors and the best market practices used as Model Risk mitigating measures, covering technical and organizational aspects. The methodology applied was a case study with a qualitative approach, using interviews with experts and the researcher's observation as data collection techniques. The Life Cycle Management process of models in Financial Institutions was modeled, and the Model Risk factors were identified and evaluated, with actions indicated to detect and predict Model Risk. Finally, as a result, a qualitative method for Maturity Assessment in Model Risk Management in a Financial Institution is presented.

9
  • Felipe Gonçalves Pereira
  • Forecasting Inflation in Brazil with Machine Learning Methods: Integrating Shrinkage Method for Variable Selection with Shapley Value Interpretation

  • Advisor: JOAO GABRIEL DE MORAES SOUZA
  • COMMITTEE MEMBERS:
  • JOAO CARLOS FELIX SOUZA
  • JOAO GABRIEL DE MORAES SOUZA
  • MATHIAS SCHNEID TESSMANN
  • PENG YAOHAO
  • Date: May 20, 2024

  • Abstract:
  • This dissertation investigates the effectiveness of non-linear machine learning (ML) models in predicting the Brazilian consumer price index (IPCA). By leveraging advanced ML techniques such as Random Forest, AdaBoost, Extreme Gradient Boosting (XGBoost), and Gradient Boosting, we aim to enhance the accuracy of inflation forecasting within an emerging market context. Spanning from August 2010 to January 2024, our analysis utilizes a comprehensive dataset comprising 156 predictors. To optimize model performance, we employ recursive feature elimination (RFE) with ElasticNet, isolating the 30 most influential predictors.

    Among the models examined, Gradient Boosting emerges as the most effective, exhibiting superior accuracy indicators. Notably, we enhance the interpretability of this winning model through the Shapley Value, an Explainable Artificial Intelligence (XAI) technique. By elucidating the individual contributions of variables, Shapley Value mitigates the "black box" nature of ML predictions, offering valuable insights into the dynamic interactions between predictor variables essential for strategic decision-making within institutions.

    This dissertation underscores the potential of integrating sophisticated ML methodologies with traditional macroeconomic tools to refine forecasting capabilities. It also points towards promising avenues for future research, such as exploring deep learning and hybrid models. Moreover, our findings extend beyond academia, offering practical insights for navigating complex economic landscapes.
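The variable-selection-then-boosting idea described above can be sketched with scikit-learn: recursive feature elimination (RFE) driven by an ElasticNet estimator, followed by a Gradient Boosting regressor. Synthetic regression data stands in for the 156-predictor IPCA dataset, and the hyperparameters are illustrative assumptions, not the dissertation's tuned values:

```python
# Sketch: shrink the predictor set with RFE + ElasticNet, then fit
# Gradient Boosting on the surviving features. Synthetic data only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import ElasticNet
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

X, y = make_regression(n_samples=300, n_features=50, n_informative=10,
                       noise=5.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Keep the 10 most influential predictors per ElasticNet coefficients
selector = RFE(ElasticNet(alpha=0.1, max_iter=5000), n_features_to_select=10)
selector.fit(X_train, y_train)

gbm = GradientBoostingRegressor(random_state=42)
gbm.fit(selector.transform(X_train), y_train)
mae = mean_absolute_error(y_test, gbm.predict(selector.transform(X_test)))
print(f"MAE = {mae:.2f}")
```

Interpreting the fitted booster with Shapley values (e.g. via the `shap` package, as the abstract's XAI step suggests) would then attribute each prediction to the selected predictors.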

10
  • Arthur Rocha Temporim de Lacerda
  • Gamified Chatbot Management Process: A way to build gamified chatbots

  • Advisor: SERGIO ANTONIO ANDRADE DE FREITAS
  • COMMITTEE MEMBERS:
  • ANDRE LUIZ PERON MARTINS LANNA
  • CHARLES ANDRYÊ GALVÃO MADEIRA
  • GIOVANNI ALMEIDA SANTOS
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • Date: May 28, 2024

  • Abstract:
  • Chatbot development frameworks offer diverse construction methods, but established processes, like the Chatbot Management Process (CMP), lack activities specifically designed to boost user engagement. This thesis proposes the Gamified Chatbot Management Process (GCMP), an extension of the CMP that incorporates and adapts activities to enhance user engagement with the chatbot. Three iterations of the GCMP were developed, each incorporating improvements guided by the Goal-Question-Metric (GQM) approach, which facilitated the evaluation and evolution of the process. Real-user experiments demonstrated positive engagement, with 100% of participants achieving the proposed objectives. Additionally, the average deployment time decreased by 66% between the first and final versions, and users awarded top marks for the quality of chatbot-generated responses. These findings highlight the effectiveness of the proposed GCMP and suggest a positive correlation between its use and improvements in the development of gamified chatbots. The observed improvements in both chatbot functionality and gamification techniques are promising indicators for the adoption of the GCMP as a robust and effective process for developing gamified chatbots.

11
  • Alisson Melo Rios
  • Application of risk management in partnership accountability, with a focus on group consensus using Bayesian theory in multicriteria analysis

  • Advisor: JOAO CARLOS FELIX SOUZA
  • COMMITTEE MEMBERS:
  • ARI MELO MARIANO
  • Elaine Coutinho Marcial
  • JOAO CARLOS FELIX SOUZA
  • JOAO GABRIEL DE MORAES SOUZA
  • Date: Jun 7, 2024

  • Abstract:
  • The Secretariat of Justice and Citizenship of the Federal District has partnerships with the third sector, commonly called Civil Society Organizations (CSOs), which must issue an accountability report at the end of their projects for analysis by the public entity. The present work uses Risk Management to mitigate possible failures in the analysis of CSO accountability processes, addressing the following question: can the Public Administration mitigate risks related to divergent measurements of the results agreed under partnerships between the public administration and CSOs, so that decision-makers' perceptions of the weights of the main criteria are similar across approval, approval with reservation, or rejection of accounts? To this end, the external and internal context, criteria, identification, analysis and assessment of risks were established, and the treatment action was defined within the scope of the research. Tools such as Failure Modes and Effects Analysis, the Current Reality Tree and the Analytic Hierarchy Process for group decision were used, with calculations applying Bayesian theory before the Aggregation of Individual Judgments. The results showed that Bayesian theory streamlined the application of the Analytic Hierarchy Process together with the Aggregation of Individual Judgments, and that, among the criteria weights under group consensus, Social Impact proved irrelevant despite being the main purpose of the work of Civil Society Organizations, which reveals yet another risk. When analyzing the weight of the criteria in the alternatives, Cost interferes the most and Efficiency the least in the specialists' evaluation; for approval, Effectiveness interferes the most and Cost the least.

2023
Dissertations
1
  • Jose Carlos Ferrer Simões
  • Evaluation of the Efficiency of Investments in Physical Security in a Financial Institution

  • Advisor: JOAO CARLOS FELIX SOUZA
  • COMMITTEE MEMBERS:
  • ARI MELO MARIANO
  • JOAO CARLOS FELIX SOUZA
  • JOAO GABRIEL DE MORAES SOUZA
  • PENG YAOHAO
  • Date: Jan 6, 2023

  • Abstract:
  • Financial institutions invest heavily in physical and property security, implementing increasingly modern security devices, structural reinforcement works in treasuries and safe rooms, and reinforced surveillance systems, with the purpose of mitigating losses resulting from external thefts and fraud against branches. However, it has been observed that investing resources in certain branches, from a physical security standpoint, does not guarantee the same degree of relative efficiency as in other branches that received less investment. This work therefore analyzes the minimum necessary investments, focused on physical banking security, so that operational risk in the external-event modality (external theft and fraud against the banking system) remains within the risk appetite established by the institution. Only variables and investments related to physical and property security are considered. The analysis is carried out in a Brazilian financial institution through Data Envelopment Analysis (DEA) modeling of the minimum necessary investments in physical and property security at each branch (also called a DMU), in order to identify the branches with better relative efficiency than their peers. For the DEA application, four inputs were used (investment in physical and property security equipment, investment in infrastructure works, expenditure on surveillance, and number of guards per branch) and two outputs (mitigated loss and the inverse of the potential risk). Through this analysis, it is possible to identify the regions, states (UFs) and DMUs that are efficient, as well as possible reductions in security investment, and to determine which efficient DMUs can act as benchmarks for the less efficient ones. Finally, Tobit regression was used to verify possible influences of indirect factors that may impact the efficiency scores.
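The core DEA computation described here, scoring each branch (DMU) against the efficient frontier, can be sketched as an input-oriented CCR model solved as a linear program. The two-input, one-output branch data below are invented for illustration and are far smaller than the study's input/output set:

```python
# Input-oriented CCR DEA as a linear program: for each DMU o,
# minimize theta s.t. a convex combination of peers uses at most
# theta * inputs(o) while producing at least outputs(o).
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0],
              [4.0, 2.0], [2.0, 4.0]])   # inputs per DMU (toy data)
Y = np.array([[1.0]] * 5)                # single unit output per DMU
n, m = X.shape
s = Y.shape[1]

def efficiency(o):
    # decision variables: [theta, lambda_1 .. lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

scores = [round(efficiency(o), 3) for o in range(n)]
print(scores)  # a score of 1.0 marks a frontier DMU (a benchmark)
```

DMUs scoring below 1.0 could, in principle, reach the frontier by shrinking all inputs by their score, which is the "possible reduction in security investment" reading of the abstract.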

2
  • João Laterza
  • Automated classification of complaints opened by customers and users of the National Financial System

  • Advisor: THIAGO DE PAULO FALEIROS
  • COMMITTEE MEMBERS:
  • GLAUCO VITOR PEDROSA
  • LUIS PAULO FAINA GARCIA
  • NÁDIA FELIX FELIPE DA SILVA
  • THIAGO DE PAULO FALEIROS
  • Date: Feb 10, 2023

  • Abstract:
  • The Central Bank of Brazil (BCB) is responsible for assisting customers and users of the National Financial System (SFN) with complaints against products and services offered by its supervised entities, and for ensuring that all demands are appropriately addressed. Each complaint is manually handled and classified as “proceeding” or “unfounded” based on a preliminary analysis of the facts described by the customer, the reply of the entity concerned, the attached documents, and its compliance with current regulations. However, dealing with an increasing demand using the available resources has proved an unprecedented challenge, and it is currently impossible to handle all issues accordingly. In this context, the BCB has developed an automated solution to filter complaints more likely to be classified as “proceeding”, thus driving the human activities of examination, analysis and judgment. The present study proposes a new classifier for this task with better performance than the current model. To this end, deep learning approaches were explored, differing from the traditional method previously employed. An experimental classifier was tested, based on a hierarchical structure that combines Bidirectional Encoder Representations from Transformers (BERT) with Bidirectional Long Short-Term Memory (BiLSTM) to generate unified representations of both the citizen's complaint and the entity's reply. In cross-validation, this solution reached an average performance of 71.41% in terms of the area under the precision-recall curve (PRAUC), exceeding the current BCB model by 0.96 percentage points, and achieved 71.92% on the test dataset. If the proposed classifier were adopted to drive the task of handling complaints, we estimate that approximately 90.23% of the proceeding demands could be identified by evaluating only 60% of the total volume. In addition, multimodal strategies were tested to combine the described textual representations with tabular features of the original classifier; however, compared to the previously proposed solution, the gain achieved with the multimodal strategies was not statistically significant.
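The evaluation and triage logic in this abstract, scoring complaints, measuring PRAUC, and inspecting only the top-ranked 60%, can be illustrated on toy data. The scores below are synthetic stand-ins for a classifier's outputs (scikit-learn's average precision is used as the PRAUC estimate), not the BCB model or its data:

```python
# Toy illustration of PRAUC and "inspect only the top 60%" triage.
import numpy as np
from sklearn.metrics import average_precision_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)   # 1 = "proceeding" complaint
# Informative synthetic scores: positives tend to score higher.
scores = np.clip(y_true * 0.6 + rng.normal(0.0, 0.3, 1000), 0.0, 1.0)

prauc = average_precision_score(y_true, scores)
print(f"PRAUC = {prauc:.3f}")

# Triage: rank complaints by score and examine only the top 60%.
k = int(0.6 * len(scores))
top = np.argsort(scores)[::-1][:k]
recall_at_60 = y_true[top].sum() / y_true.sum()
print(f"share of proceeding demands caught in top 60%: {recall_at_60:.2%}")
```

The "90.23% identified at 60% of the volume" figure in the abstract is exactly this recall-at-fixed-workload quantity, computed on the real model's scores.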

3
  • Diego Marques de Azevedo
  • A probabilistically-oriented analysis of the performance of ASR systems for Brazilian radios and TVs

  • Advisor: GUILHERME SOUZA RODRIGUES
  • COMMITTEE MEMBERS:
  • ANDERSON DA SILVA SOARES
  • GLAUCO VITOR PEDROSA
  • GUILHERME SOUZA RODRIGUES
  • NELSON CRUZ SAMPAIO NETO
  • Date: Mar 3, 2023

  • Abstract:
  • With the use of neural network-based technologies, Automatic Speech Recognition (ASR) systems for Brazilian Portuguese (BP) have shown great progress in the last few years. Several state-of-the-art results were achieved by open-source end-to-end models, such as the Kaldi toolkit and Wav2vec 2.0. Alternative commercial tools are also available, including the Google and Microsoft speech-to-text APIs and VoiceInteraction's Audimus System. We analyze the relative performance of such tools, in terms of the so-called Word Error Rate (WER), when transcribing audio recordings from Brazilian radio and TV channels. A generalized linear model (GLM) is designed to stochastically describe the relationship between some of the audio's properties (e.g. file format and audio duration) and the resulting WER, for each method under consideration. Among other uses, such a strategy enables the analysis of local performance, indicating not only which tool performs better, but when exactly it is expected to do so. This, in turn, could be used to design an optimized system composed of several transcribers. The data generated for this experiment and the scripts used to produce the stochastic model are publicly available.
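A toy version of such a GLM can be sketched with scikit-learn's Gamma regressor (log link), relating audio properties to a positive, skewed WER response. The data, coefficients and the choice of a Gamma family are illustrative assumptions, not the study's actual model specification:

```python
# Sketch: Gamma GLM (log link) relating audio properties to WER.
# Synthetic data invented for illustration.
import numpy as np
from sklearn.linear_model import GammaRegressor

rng = np.random.default_rng(1)
duration = rng.uniform(5, 300, 500)        # audio duration in seconds
is_mp3 = rng.integers(0, 2, 500)           # hypothetical format indicator
mu = np.exp(-2.0 + 0.002 * duration + 0.3 * is_mp3)  # true mean WER
wer = rng.gamma(shape=5.0, scale=mu / 5.0)           # positive, skewed

X = np.column_stack([duration, is_mp3])
glm = GammaRegressor(alpha=0.0, max_iter=10000).fit(X, wer)

# Local performance: expected WER for a 60 s clip, mp3 vs. non-mp3
pred = glm.predict(np.array([[60.0, 1], [60.0, 0]]))
print(pred)
```

Fitting one such model per transcriber and comparing the predicted WER for a given audio profile is the "which tool, and when" comparison the abstract describes.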

4
  • Públio Pastrolin Cavalcante
  • Evaluation of the change in the quality of reports with the application of gamification in police intelligence activities

  • Advisor: SERGIO ANTONIO ANDRADE DE FREITAS
  • COMMITTEE MEMBERS:
  • ANDRE LUIZ PERON MARTINS LANNA
  • CHARLES ANDRYÊ GALVÃO MADEIRA
  • GEORGE MARSICANO CORREA
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • Date: Mar 23, 2023

  • Abstract:
  • The gamification of non-playful activities has gained ground in industrial, service and educational settings, and the literature points out that introducing game elements into an environment can make activities more attractive, fun and enjoyable. Combining game techniques to motivate and engage people has proven to be a powerful tool for managers. The Military Police of the Federal District is the police corporation responsible for ostensive policing and the maintenance of public order in the federal capital, and its Intelligence Activity advises on strategic, tactical and operational decisions. This work presents the Gamified Intelligence Reporting Process (PGRI) for use in the Military Police Intelligence System. PGRI was built on the Octalysis gamification framework and evaluated using Intrinsic Motivation Inventory questionnaires and document analysis subjected to statistical analysis. PGRI seeks to improve the quality of intelligence reports through gamification and is refined by an iterative planning cycle fed by the process evaluation results. The work also presents a literature review on the concepts of games and gamification, intelligence activities and public security, and an overview of the Military Police of the Federal District.

5
  • José Nilo Alves de Sousa Neto
  • Estimation of Union Property Values Administered by the Brazilian Army with Machine Learning Techniques and Spatial Components

  • Advisor: MARCELO LADEIRA
  • COMMITTEE MEMBERS:
  • MARCELO LADEIRA
  • ARI MELO MARIANO
  • EDISON ISHIKAWA
  • BERNARDO ALVES FURTADO
  • Date: Apr 18, 2023

  • Abstract:
  • The valuation of an institution's patrimony is a necessary condition for efficient asset management. The execution and analysis of real estate appraisal reports are essential to some of the Brazilian Army's strategic objectives, but they are also quite costly in terms of time, labor and financial resources. Sometimes, great effort is required for these steps to take place, and the market value finally obtained is inconsistent with what the authorities initially expected, so the technical study is not effectively used in negotiations by the organization. In this sense, this work proposes the development of multilevel predictive models capable of estimating urban and rural real estate values. The models show a reasonable level of assertiveness and national geographic coverage when generating estimated market values of Union real estate assets. Variables intrinsic and extrinsic to the properties were considered, including tests aggregating spatial components onto some of them. As the interpretability of the proposed solution is an important requirement, in both the linear and nonlinear approaches the Shapley value was adopted as a tool to support explainability. The partial least squares structural equation modeling (PLS-SEM) method was applied to select features in a reasoned and visually accessible manner. These two considerations, associated with real estate price modeling at a national level, represent an innovation of this work relative to the analyzed scientific literature.

6
  • Rodrigo Pereira Gomes
  • Proposition of a Human Fatigue Risk Management computational tool for Air Traffic Controllers

  • Advisor : SIMONE BORGES SIMAO MONTEIRO
  • COMMITTEE MEMBERS :
  • SIMONE BORGES SIMAO MONTEIRO
  • ARI MELO MARIANO
  • EDGARD COSTA OLIVEIRA
  • REGIANE MÁXIMO SIQUEIRA
  • Data: Apr 24, 2023


  • Show Abstract
  • This work aims to propose a computational tool for managing risks related to Human Fatigue, based on the Fatigue Risk Management System (FRMS; Sistema de Gerenciamento do Risco à Fadiga) approach, applied to the Primeiro Centro Integrado de Defesa Aérea e Controle de Tráfego Aéreo (CINDACTA I), in order to minimize fatigue-related risks. To this end, exploratory, descriptive, applied research was carried out, with a qualitative and quantitative approach. Data collection drew on documents from the Air Force Command, national and international legislation, a questionnaire, and brainstorming with specialists in aeronautical accident investigation and prevention and in air traffic control. As a result of this study, the perception of operators who work in operational bodies was collected. Based on the literature, the professionals answered a questionnaire with 44 objective questions directly related to Human Fatigue, which can influence Operational Performance. To analyze the resulting data, partial least squares structural equation modeling (PLS-SEM) was applied, which revealed, based on the responses of 149 operators, that the proposed conceptual model explains 34.7% of operational performance, 31.2% of the reaction-time dimension, and 29.9% of the state-of-alert dimension. In the validation stage, the computational solution was presented to specialists and analysts in meetings, aiming to validate the Sistema de Gerenciamento de Riscos da Fadiga (SGRF) and evaluate the feasibility of its use by the institution. The study revealed that, in order to improve the operational performance of air traffic controllers, actions should be taken regarding sleep quality and reaction time. Finally, a prototype of the computational tool, developed, tested, and validated by the Organization's specialists and called the SGRF, was presented, covering all the essential stages of the fatigue risk management process for the Air Traffic Control Officer (ATCO), according to the FRMS approach.

7
  • Marcella Guarnieri Mercês
  • Employee turnover prediction for military officers in the Brazilian Army using machine learning techniques

  • Advisor : MARCIO DE CARVALHO VICTORINO
  • COMMITTEE MEMBERS :
  • EDISON ISHIKAWA
  • MARCIO DE CARVALHO VICTORINO
  • MARIA CLÁUDIA REIS CAVALCANTI
  • THIAGO DE PAULO FALEIROS
  • Data: Jun 22, 2023


  • Show Abstract
  • This research aims to analyze the turnover of Brazilian Army officers over the years and propose a set of techniques that allow the identification of those officers more likely to resign. This result can help the Brazilian Army act so that these resignations have a lesser impact on its productivity. The approaches chosen considered the most common techniques used to predict employee churn in other areas. Common classification algorithms, such as K-Nearest Neighbors (KNN), Naive Bayes, Support Vector Machines (SVM), Decision Trees and Random Forest, were used. Techniques that could improve the initial results were also studied so that they can be applied in the continuation of this research.

8
  • EDMILSON COSME DA SILVA
  • Prediction of Academic Dropout in Higher Education: The Case of Face-to-Face Undergraduate Courses at the University of Brasília

  • Advisor : SERGIO ANTONIO ANDRADE DE FREITAS
  • COMMITTEE MEMBERS :
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • ANDRE LUIZ PERON MARTINS LANNA
  • ANDREA FELIPPE CABELLO
  • RAFAEL FERREIRA LEITE DE MELLO
  • Data: Jul 3, 2023


  • Show Abstract
  • For some time now, (inter)national researchers have been studying dropout rates in higher education courses, classifying them into two types: students who drop out of the university, and students who drop out of higher education. Both situations cause damage to the institution, students, and society at large. Starting in 1995, with the creation of ANDIFES, studies began to be more frequent in Brazil. This commission developed reports that analyzed graduation rates, retention, and dropout rates in undergraduate courses at Brazilian universities. Institutional dropout was one of the study subjects, described as a student leaving their original course without completing it. The University of Brasília (UnB), considering the issues surrounding student dropout, has created mechanisms to increase student retention in undergraduate courses. The objective of this work was to develop and evaluate an analytical model that allows the use of academic data for predicting dropout rates in face-to-face undergraduate courses. A Systematic Literature Review was conducted to identify the factors that impact dropout rates and define indicators that can be extracted from UnB's academic systems. It also helped in selecting algorithms/tools to support the analysis. The main result of the systematic review was the identification of 29 factors used by researchers, where average score, gender, and course grades were the most commonly used ones. Regarding tools, Regression, Decision Tree, and Neural Network were the most frequently used algorithms. Based on this preliminary result, the Undergraduate Analysis Model (MAGRA) was created, which utilizes existing indicators in UnB's academic systems in conjunction with machine learning tools to predict students at risk of dropout. The research was conducted in an environment that encompasses two scenarios. The first was developed at the Faculty of Gama (FGA), which served as a prototype for MAGRA's creation, and the second at UnB. 
In the testing stages, where only the courses taken by students were considered, it was shown that the number of times a student takes a course can be an indicator of their difficulty in completing it within the designated time frame. By employing the new variables, there was an increase in the number of valid models and, consequently, an increase in the analyzed courses and classes, resulting in a higher number of predictions. This situation was observed during the studies conducted at the Faculty of Gama and the University of Brasília. To improve early identification of students with dropout characteristics, it is necessary to create feedback mechanisms from course coordinators, introduce new systems, improve data quality, and adjust algorithm parameters.

9
  • Amilton Lôbo Mendes Junior
  • The use of classification techniques applied to the profiling of workers in the National Employment System: a machine learning approach

  • Advisor : GLADSTON LUIZ DA SILVA
  • COMMITTEE MEMBERS :
  • FRANCISCO JOSÉ DA SILVA E SILVA
  • GLADSTON LUIZ DA SILVA
  • MARCELO LADEIRA
  • MARISTELA TERTO DE HOLANDA
  • Data: Jul 4, 2023


  • Show Abstract
  • The present work proposes the application of machine learning to the profiling of workers in the Brazilian public employment system, the National Employment System (NES; Sine, in Portuguese). An automated worker-profiling mechanism will allow efforts to be directed to the preventive treatment of those workers most likely to remain outside the formal labor market longer. Accuracy and F1-Score were used as metrics in the evaluation of the proposed models. The model with the best results in the experiments was Extreme Gradient Boosting. Future improvements may include the addition of features related to the worker's profile, the local labor market, and transitions of occupations, economic sectors and residence.

10
  • Eric Hans Messias da Silva
  • Abstractive Summarization of Long Documents Used in Inspections and Procedural Instructions

  • Advisor : MARCELO LADEIRA
  • COMMITTEE MEMBERS :
  • ANDREI LIMA QUEIROZ
  • MARCELO LADEIRA
  • THIAGO ALEXANDRE SALGUEIRO PARDO
  • THIAGO DE PAULO FALEIROS
  • Data: Jul 13, 2023


  • Show Abstract
  • The Brazilian Federal Court of Accounts organizes its work into processes and, throughout their life cycle, each usually contains tens to hundreds of legal documents. Each document easily reaches a few dozen pages. The number of processes and documents only tends to grow over time, which generates a huge amount of material for reading, with very rich content that is nevertheless difficult to consume, since each process takes considerable time to read. Processes are usually read to verify whether they contain content relevant to an ongoing inspection or procedural instruction. In addition to the high cost of reading a process, part of its content is discarded by the auditor because it is not linked to their current work, which wastes time. To improve the efficiency of this activity, this work proposes the development of an automatic text summarization solution using machine learning applied to natural language processing. The solution uses the abstractive summarization approach applied to long documents with legal content, employing state-of-the-art models for the task based on transformers with a linear attention mechanism. The solution will be made available as a web application with a microservice for better integration with the applications that make up the auditor's work process. The summaries generated by the models will be evaluated mainly by metrics that focus on the semantics of the generated text and, as a result, will better adhere to the desired content. Users will provide feedback on the generated summaries, which will later be used to retrain the model.

11
  • Marcos Paulo Pereira da Silva
  • Digital Ecosystem Diagnosis: a case study
  • Advisor : REJANE MARIA DA COSTA FIGUEIREDO
  • COMMITTEE MEMBERS :
  • BRUNA DIIRR GONCALVES DA SILVA
  • EDNA DIAS CANEDO
  • ELAINE VENSON
  • REJANE MARIA DA COSTA FIGUEIREDO
  • Data: Jul 14, 2023


  • Show Abstract
  • The governments of several countries have sought to adapt to the digital world by employing digital transformation to increase the delivery of value to society. The Federal Court of Accounts – Brazil (TCU) approved its Digital Strategy in 2020, aiming to steer its operation in the digital world and increase the impact of control actions for society. Understanding and diagnosing the current state of its digital ecosystem and defining improvement guidelines are essential steps in TCU's transformation journey. In this context, the objective of this work was to propose guidelines for a Digital Ecosystem in the context of digital transformation at TCU. The case study technique, with a qualitative data approach, was used to diagnose a thematic area of the TCU digital strategy. The selected thematic area was Federal Union Personnel Inspection, and a specific assessment instrument was generated by adapting an ecosystem assessment model, the SEG-M². This process and the diagnosis results were validated by the TCU through the focus group technique, providing a portrait of TCU's digital ecosystem. The adapted instrument enabled the identification of points of attention and possible improvements in the TCU ecosystem's strategy in this thematic area. In future work, there is an opportunity to apply the process in the other thematic areas of TCU's Digital Strategy to obtain a global vision of the digital ecosystem. Moreover, there is an opportunity to apply the instrument in other public bodies, allowing for the sharing of experiences and knowledge and contributing to the definition of an instrument adapted to the characteristics of Brazilian public organizations.

12
  • MICHAEL RODRIGUES DA SILVA
  • Reactive Cognitive Architecture based on microservices and micro frontends to improve the user experience in internet banking through Adaptive Interfaces

  • Advisor : LETICIA LOPES LEITE
  • COMMITTEE MEMBERS :
  • ANDRE LUIZ PERON MARTINS LANNA
  • EDISON ISHIKAWA
  • LETICIA LOPES LEITE
  • MÁRCIA CRISTINA MORAES
  • Data: Aug 1, 2023


  • Show Abstract
  • Advances in communication and computing technologies have boosted the use of the internet, affecting the life of society and playing an important role in business. The banking industry has followed this evolution, offering essential services through the internet to millions of customers. In addition, competition between banks that offer services through the internet has grown steadily. In this scenario, User Experience (UX) is an important factor in the perception of quality by the customers of these institutions, even influencing the choice between one bank and another. As software architecture is an important element that affects usability, it is essential to ensure that usability and user experience criteria are supported when planning this architecture. Therefore, this work proposes an architecture for banking systems based on concepts of reactive systems, providing for integration with heterogeneous services and data sources to improve the UX, with an interface that adapts to the user based on their behavior and characteristics. Adaptive interfaces tend to offer a better user experience, as they seek to adapt to the needs and desires of users. There are several ways to build adaptive applications; this work follows the micro-frontends model, as it generates smaller, more cohesive and sustainable code bases, making it possible to work with autonomous teams and different technologies. These characteristics are important, especially for larger corporations, which support larger groups of professionals and teams. The proposed architecture uses Reinforcement Learning, combined with the Monte Carlo Tree Search (MCTS) algorithm and Deep Learning techniques, to create adaptive applications. Finally, a robust backend system has to be developed to extract, store and process user data, so the microservices architecture was adopted, which allows applications to be formed by small, cohesive and independent services.

13
  • Rogério Gabriel Nogalha de Lima
  • IT spending prioritization model in the context of budget constraints: an approach with PLS-SEM within the Ministry of Economy

  • Advisor : ARI MELO MARIANO
  • COMMITTEE MEMBERS :
  • ARI MELO MARIANO
  • JOAO GABRIEL DE MORAES SOUZA
  • REGIANE MÁXIMO SIQUEIRA
  • RICARDO MATOS CHAIM
  • Data: Aug 30, 2023


  • Show Abstract
  • This study aims to propose a framework for evaluating the prioritization of information technology expenditures in a budget-constrained scenario as an influencing factor in IT governance and management at the Ministry of Economy. A literature review was conducted, based on the Consolidated Meta-Analytic Focus Theory - TEMAC, followed by a survey that collected 112 responses from Ministry executives impacted by this budgetary restriction. The research model was computed using structural equations, and the results indicated its suitability for assessing the researched environment. In this study, the resulting model successfully accounted for 47.2% of Governance and Management, 34% of Technology utilization, and 11.3% of Investment Consistency, with Business Opportunity and e-Government (Electronic Government) identified as the most significant variables influencing prioritization in a budgetary and financial constraint environment.

14
  • DANIELA DE OLIVEIRA MORAES
  • RISK FACTORS IN THE USE OF INFORMATION SYSTEMS OF THE MINISTRY OF AGRICULTURE AND LIVESTOCK: A STUDY OF DIRECT AND MODERATING RELATIONSHIPS

  • Advisor : ARI MELO MARIANO
  • COMMITTEE MEMBERS :
  • ARI MELO MARIANO
  • DANIEL JUGEND
  • JOAO GABRIEL DE MORAES SOUZA
  • VIVIANE VASCONCELLOS FERREIRA GRUBISIC
  • Data: Aug 31, 2023


  • Show Abstract
  • The use of an information system brings constant benefits to those who use it, due to the increase in the data generated. Grouping this information can be useful for daily activities in any work environment, and even in personal life. An information system is not necessarily a computerized system, but rather a form of organization, whether through an automated platform or simply a manual file. Along with the use of IS come risks that involve its development and that may affect its perceived usefulness. Thus, the objective of this dissertation was to propose steps to minimize the risk factors that impact the use of information systems delivered by the Undersecretariat of Information Technology of the Ministry of Agriculture and Livestock. Initially, the literature was searched for the risk variables the authors consider most relevant. Subsequently, structural equation modeling was applied to relate these variables and thus identify the factors with the greatest impact on the use of the IS. Regarding the practical implications, a questionnaire was first applied, collecting 71 responses. The main results are: Perceived Usefulness influences the use of IS in 38.24% of cases, and Ease of Use influences Perceived Usefulness 15.21% of the time. In addition, the importance-performance map (IPMA) was generated to enrich the results of the PLS-SEM model and assist in prioritizing the variables of greatest relevance and impact. In order of high importance and high performance, the indicators Perceived Usefulness and Risks will be prioritized. Moreover, the moderation analysis revealed the variable Age as a point of attention, since it impacts the perceived usefulness of information systems.

15
  • Rafael Bruno Peccatiello
  • Detection of malicious behavior of internal users on networks using data flow analysis

  • Advisor : LUIS PAULO FAINA GARCIA
  • COMMITTEE MEMBERS :
  • LUIS PAULO FAINA GARCIA
  • MARCOS FAGUNDES CAETANO
  • SYLVIO BARBON JÚNIOR
  • THIAGO DE PAULO FALEIROS
  • Data: Sep 1, 2023


  • Show Abstract
  • The present study addresses the problem of detecting insider threats. An insider threat is anyone who has legitimate access to a particular organization's network and uses that access to harm the organization. The study solves the problem by proposing a model that brings together different Data Science techniques, among them supervised Machine Learning and the data-stream analysis approach. The latter was developed with a treatment for imbalanced data streams, since imbalance is inherent to the problem domain. The stream imbalance is treated by propagating minority samples to compose a balanced sliding training window. The attributes, and the way in which they were extracted, come from sources in the literature. The algorithms used were Random Forest (RF), Light Gradient Boosting Machine (LGBM) and K-Nearest Neighbors (KNN). The performance of the model was evaluated according to precision, recall, F1-Score and kappa. The results show that treating the imbalance achieves performance superior to that reported in the literature for insider threat detection.
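The minority-sample propagation described in the abstract can be sketched roughly as follows. This is an illustrative example only, not the dissertation's actual implementation; the window size, sample format and balancing rule are invented for demonstration:

```python
def make_training_window(recent, minority_pool):
    """Balance a sliding training window by propagating past minority samples.

    recent: list of (features, label) pairs from the current window
            (label 1 = insider threat, the rare class).
    minority_pool: minority samples retained from earlier windows.
    """
    window = list(recent)
    n_min = sum(1 for _, y in window if y == 1)
    n_maj = len(window) - n_min
    # Re-inject stored minority samples until the window is balanced
    # (or the pool runs out).
    i = 0
    while n_min < n_maj and i < len(minority_pool):
        window.append(minority_pool[i])
        n_min += 1
        i += 1
    return window

# Hypothetical stream slice: three benign samples, one threat sample,
# plus a pool of threat samples kept from previous windows.
recent = [((0.1,), 0), ((0.2,), 0), ((0.3,), 0), ((0.4,), 1)]
pool = [((0.9,), 1), ((0.8,), 1), ((0.7,), 1)]
balanced = make_training_window(recent, pool)
```

A classifier such as Random Forest would then be retrained on each balanced window as the stream advances.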

16
  • William Oliveira Camelo
  • The effect of climate externalities on credit risk

  • Advisor : JOAO GABRIEL DE MORAES SOUZA
  • COMMITTEE MEMBERS :
  • JOAO CARLOS FELIX SOUZA
  • JOAO GABRIEL DE MORAES SOUZA
  • MATHIAS SCHNEID TESSMANN
  • PENG YAOHAO
  • Data: Sep 14, 2023


  • Show Abstract
  • Investments in financial operations can generate impacts on the social, economic, and environmental sectors that were not incorporated into the final price of their contracts. Such impacts are called externalities, and they can produce positive or adverse effects. The analysis of socio-environmental credit risk concerns the possibility of economic activities generating socio-environmental impacts or damages capable of producing financial losses that compromise the ability to repay the financing, as well as the co-responsibility of the financial agent for the recovery of socio-environmental damage. In the case of climate risk, the financed venture is directly or indirectly exposed to extreme weather events capable of jeopardizing its cash flow and ability to repay the financing. Thus, measurement instruments capable of evaluating the behavior of externalities in relation to the credit risk of operations became necessary. The objective of this study is to assess the impact of incorporating climate risk measurement into financial operations within the strategic process of granting credit. For this, the climate risk of operations was measured and a methodology was applied to assess the importance of the variables that compose it and infer their ability to influence the credit analysis. As results, the present study provides a measurement of the climate risk of operations and an assessment of climate risk in relation to those variables.

17
  • Daniel Ferreira Schulz
  • Failure Detection in a Banking Mobile Application

  • Advisor : ALETEIA PATRICIA FAVACHO DE ARAUJO VON PAUMGARTTEN
  • COMMITTEE MEMBERS :
  • ALETEIA PATRICIA FAVACHO DE ARAUJO VON PAUMGARTTEN
  • GERALDO PEREIRA ROCHA FILHO
  • THIAGO DE PAULO FALEIROS
  • ALAN DEMÉTRIUS BARIA VALEJO
  • Data: Nov 10, 2023


  • Show Abstract
  • High availability is an increasingly important requirement in IT systems. One of the strategies for achieving a stable environment is the continuous monitoring of services, as described by ITIL. Accordingly, this work proposes a failure detection approach based on data mining techniques. The approach was modeled using the CRISP-DM reference model. The trained models used data extracted from a Web Analytics tool that stores user interactions with a mobile banking application. The effects of different feature engineering techniques, such as variable filtering, data standardization and generation of synthetic samples, were also evaluated. Finally, the results of seven algorithms were compared, and the support vector machine obtained the best result, with an F1-Score of 0.954 and a ROC-AUC of 0.989.

18
  • Luís Augusto Vieira Ribeiro
  • Application of Multicriteria Decision Analysis and Optimization in Decision Making: A Study of Information Technology Demands in the Federal Government

  • Advisor : JOAO CARLOS FELIX SOUZA
  • COMMITTEE MEMBERS :
  • JOAO CARLOS FELIX SOUZA
  • ARI MELO MARIANO
  • JOAO GABRIEL DE MORAES SOUZA
  • Lena Lúcia de Moraes
  • Data: Nov 14, 2023


  • Show Abstract
  • In an increasingly dynamic environment, the speed of decision-making has become a differential, especially in a scenario with many obstacles, such as government. Interests from different spheres (municipal, state and federal) and different actors (political parties, caucuses, mobilization of popular groups) must be consolidated into the most assertive decision for society. Thus, methods that reduce subjectivity and consolidate different points of view are necessary. The objective of this study is to demonstrate a decision-making model using the Analytic Hierarchy Process (AHP) together with the Promethee II method, applying optimization through the Risk Simulator tool to prioritize the information technology demands of the Federal Public Ministry. This is exploratory research with a qualitative-quantitative approach using multicriteria analysis and optimization. As a preliminary result, the criteria Support, Strategic and Stakeholders were established, with respective approximate weights of 65%, 28% and 7%. Eight alternatives were defined, the most prominent of which was "Technical knowledge of the ICT team", with an approximate weight of 24.5%. The combination of criteria and alternatives applied during the research resulted in the Security demand being the highest priority and the BI demand the lowest among the 12 selected. The Risk Simulator tool was then applied to carry out the optimized prioritization according to constraints defined by senior management.
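The AHP weighting step mentioned in the abstract can be illustrated with a minimal sketch. The pairwise judgments below are hypothetical, chosen only to roughly reproduce the reported ordering of the three criteria (Support > Strategic > Stakeholders); they are not taken from the study:

```python
import math

def ahp_weights(matrix):
    """Derive AHP priority weights from a pairwise-comparison matrix
    using the row geometric-mean approximation."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgments on Saaty's 1-9 scale:
# Support vs Strategic = 5, Support vs Stakeholders = 9,
# Strategic vs Stakeholders = 5 (reciprocals below the diagonal).
pairwise = [
    [1.0, 5.0, 9.0],
    [1 / 5, 1.0, 5.0],
    [1 / 9, 1 / 5, 1.0],
]
weights = ahp_weights(pairwise)  # weights sum to 1, Support ranked first
```

In practice the consistency ratio of the judgment matrix would also be checked before accepting the weights.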

19
  • Lucilene da Silva Leite
  • Proposal for improving the Startups Selection Process in Public Procurement of Innovative Solutions (PPI) through Risk Management

  • Advisor : SIMONE BORGES SIMAO MONTEIRO
  • COMMITTEE MEMBERS :
  • SIMONE BORGES SIMAO MONTEIRO
  • ARI MELO MARIANO
  • JOHN LENON CARDOSO GARDENGHI
  • ADRIANA REGINA MARTIN
  • Data: Dec 18, 2023


  • Show Abstract
  • The present research focused on the risk management of Public Procurement of Innovative Solutions (PPI), specifically at a Brazilian public financial institution. Nowadays, organizations rarely implement differentiated risk assessment methods for processes with specific characteristics and a high degree of uncertainty, such as public procurements of innovative solutions. Thus, the absence of a specific risk management model for the procurement of innovative solutions is a gap that must be addressed by public institutions in the Brazilian Innovation Ecosystem. Therefore, it was necessary to carry out a study to understand how risk management can be better integrated with innovation management, and how it can be applied more adequately to the procurement of innovative solutions in Open Innovation Journeys, to support strategic decision-making and gradually reduce unsuccessful procurement processes. The partial results from the case study and the current (AS-IS) process showed that startups are evaluated with the same criteria as large and medium-sized corporations; that is, the evaluation considers neither the size of the solution or service provider nor the nature of the contracted services. In addition, the risk identification process showed that the institution has suffered the consequences of materialized risks, such as procurements that do not deliver the expected technological efficiency due to the lost opportunity to develop an innovative solution in partnership with a better prepared and more mature startup that was disqualified during the institution's standard risk assessment.

20
  • José Anderson de Freitas Silva
  • Robotic Automation of Processes in a Federal Educational Institution: A Case study

  • Advisor : REJANE MARIA DA COSTA FIGUEIREDO
  • COMMITTEE MEMBERS :
  • REJANE MARIA DA COSTA FIGUEIREDO
  • ANDRE BARROS DE SALES
  • EDNA DIAS CANEDO
  • GLAUCO HENRIQUE DE SOUSA MENDES
  • Data: Dec 18, 2023


  • Show Abstract
  • Digital transformation encompasses the use of technologies and their impact on an organization's business models, resulting in changes to products, organizational structures or process automation. The adoption of innovative resources can contribute to the improvement of work processes, especially regarding the reduction of repetitive tasks, errors and recurring problems. In this context, digitization is related to business development, with processes being supported by digital solutions or functions, while process automation applies digital technologies to increase efficiency, reducing steps in workflows and enabling the automation of repetitive human tasks. The people management area of the Federal Institute of Education, Science and Technology of Brasília (IFB) faces the challenge of modernizing the traditional methods used in its process management. The area uses electronic means to carry out administrative processes, in addition to structuring technological platforms. The plurality of solutions with little or no integration results in repetitive, low-complexity activities that consume excessive time. The objective of this work is to define a digital transformation approach for repetitive processes in the people management area of the IFB. The adopted research methodology is descriptive, using the Case Study technique. The approach is based on Robotic Process Automation, an emerging technology for process automation, selected through a systematic literature review conducted to identify techniques, tools, solutions and implementations used in digitalization with an emphasis on process automation.

21
  • Priscilla Souza Ramos Alves
  • Project risk management: an agile approach applied to a Federal Agency

  • Advisor : SIMONE BORGES SIMAO MONTEIRO
  • COMMITTEE MEMBERS :
  • SIMONE BORGES SIMAO MONTEIRO
  • ARI MELO MARIANO
  • Gustavo Jardim Portella
  • VIVIANE VASCONCELLOS FERREIRA GRUBISIC
  • Data: Dec 19, 2023


  • Show Abstract
  • This paper presents a framework, based on agile methodology, for managing IT project risk that can be applied to a federal public administration agency. To make this study possible, exploratory research of an applied nature was conducted with a case study strategy and a qualitative and quantitative approach. The techniques used for data collection were documentary research with database extraction, application of a structured questionnaire, and interviews and brainstorming with experts in the area. This research proposes an agile risk management framework called CRISK that contributes to decision-making in IT project management by using visual methods. To this end, the work presents a literature review on the subject to identify best practices, standards and techniques, a contextualization of risk management in the studied agency, and the proposed framework, aiming at agile, visual and collaborative risk management for IT projects in that agency.

22
  • Wagner Miranda Costa
  • Semantic Similarity between Judgments to Support the Formulation of TCU Jurisprudence

  • Advisor : GLAUCO VITOR PEDROSA
  • COMMITTEE MEMBERS :
  • BRUNO CESAR RIBAS
  • EDUARDO DE PAULA COSTA
  • GLAUCO VITOR PEDROSA
  • THIAGO DE PAULO FALEIROS
  • Data: Dec 21, 2023


  • Show Abstract
  • Jurisprudence refers to the set of repeated decisions on a given subject, constituting a type of judicial precedent. Within the scope of the Federal Court of Accounts (TCU), the body responsible for the external control of the Federal Public Administration, jurisprudence represents the consolidated interpretations of the rules applicable to the financial and operational supervision of the public accounts of the Union's bodies and entities. Since jurisprudence is elaborated from groupings of similar rulings, it is important to develop automated tools that assist the specialists responsible for this activity. However, this is a challenging task for computing, due to the specific vocabulary of the rulings and the massive volume of data to be processed. Therefore, it is necessary to develop scalable, effective and efficient approaches with low computational cost. This work presents the study and implementation of approaches for representing these textual documents, both at the word level and at the concept level. As a contribution, a new approach called BoC-Th (Bag of Concepts with Thesaurus) is proposed, which generates weighted histograms of concepts, with weights defined based on the distance of each word in the document to its respective similar term within a thesaurus. This approach emphasizes words with greater meaning in context, thus generating more discriminative vectors. Experimental evaluations compared the proposed approach with traditional approaches for document representation. The proposed method obtained superior results among the techniques evaluated for retrieving jurisprudential documents. BoC-Th increased average accuracy compared to traditional approaches, including the original BoC (Bag of Concepts), while also being faster than traditional BoW, BM25, and TF-IDF representations.
The proposed approach contributed to enriching an area with peculiar characteristics, providing a resource for retrieving textual information more accurately and quickly than other techniques based on natural language processing.
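The weighting idea described in the abstract — each word contributes to a concept histogram with a weight derived from its similarity to a related thesaurus term — can be sketched as follows. This is an illustrative simplification, not the dissertation's implementation; the embeddings, concept centroids, and thesaurus mapping are all hypothetical inputs.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def boc_th_histogram(doc_words, embeddings, concept_centroids, thesaurus):
    """Weighted bag-of-concepts: each word votes for its nearest concept
    centroid, weighted by similarity to its closest thesaurus term
    (weight 1.0 when the word has no thesaurus entry)."""
    hist = [0.0] * len(concept_centroids)
    for w in doc_words:
        if w not in embeddings:
            continue
        vec = embeddings[w]
        # nearest concept centroid for this word
        cid = max(range(len(concept_centroids)),
                  key=lambda i: cosine(vec, concept_centroids[i]))
        # weight by similarity to the closest related term in the thesaurus
        weight = max((cosine(vec, embeddings[t])
                      for t in thesaurus.get(w, []) if t in embeddings),
                     default=1.0)
        hist[cid] += weight
    total = sum(hist)
    return [h / total for h in hist] if total else hist
```

The resulting L1-normalized histogram is the document vector; words strongly aligned with their thesaurus neighbors contribute more mass, which is the discriminative effect the abstract describes.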

23
  • Luciana Santos de Assis
  • Gamification in Organizational Workplaces: A Case Study in a Public Company

  • Advisor : SERGIO ANTONIO ANDRADE DE FREITAS
  • COMMITTEE MEMBERS :
  • ANDRE LUIZ PERON MARTINS LANNA
  • CHARLES ANDRYÊ GALVÃO MADEIRA
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • TIAGO BARROS PONTES E SILVA
  • Data: Dec 21, 2023


  • Show Abstract
  • Gamification is a way of using game elements and applying them in real-world contexts to motivate people and achieve a more productive environment. As it is considered a tool designed to increase engagement, gamification can be applied in several areas, including organizational activities. The purpose of this work is to carry out a Case Study at the Brazilian Agricultural Research Corporation (Embrapa) through the analysis of the effects of a gamified solution applied to the Innovation Macroprocess (MPI), responsible for managing research, development and innovation solutions for the sustainability of Brazilian agribusiness. The main objective is to evaluate the effectiveness of gamification in increasing employee engagement, collaboration and motivation regarding the use of the Asset Management System (Gestec), a computational tool that implements part of the MPI. Initially, a study of the theoretical foundations of gamification was carried out, followed by a Systematic Literature Review (SLR) to investigate the main frameworks, techniques and effects observed in gamified solutions applied in organizational environments. Afterwards, a Case Study was carried out, involving a group of twenty-seven participants, to evaluate the effects that a gamification solution can provide.

2022
Dissertations
1
  • Javan de Oliveira Cruz
  • An Information Architecture for the Organization of Field Event Information in Military Operations of the Brazilian Army with the Use of Ontologies

  • Advisor : MARCIO DE CARVALHO VICTORINO
  • COMMITTEE MEMBERS :
  • EDISON ISHIKAWA
  • MARCIO DE CARVALHO VICTORINO
  • MARISTELA TERTO DE HOLANDA
  • WALLACE ANACLETO PINHEIRO
  • Data: Aug 3, 2022


  • Show Abstract
  • The Centro de Coordenação de Operações Móvel (CCOp Mv) is a strategic project of the Brazilian Army, currently under development, whose aim is to support a Large Operational Command in war and non-war situations. The CCOp Mv will be composed of different technologies, systems and resources in order to achieve its objective. Among its main functionalities are the acquisition and provision of field event information, which, in addition to guiding the construction of situational awareness, provides the essential parameters for the decision-making of authorities in military operations. Despite the technological advances experienced today, the nature of current conflicts highlights the relevance of obtaining data through human observers. However, simply obtaining data is not sufficient for the formation of specific knowledge about a given domain; a structured information base is required, one that allows knowledge to be retrieved efficiently. In this context, this research proposes an Information Architecture for obtaining data through field observers and providing the generated knowledge, guided by the organization and structuring of informational reports and employing ontologies for semantic representation. Furthermore, an ontology prototype is proposed to support the developed information architecture, since the domain of military events is highly specific. The main contributions are the Information Architecture, which aims to assist in optimizing information gathering in military operations, and the ontology prototype, which can serve as a basis for the development of other applications within the Brazilian Army.

2
  • Ingrid Palma Araújo
  • Decision Support Model to Evaluate Open Government Data from the Electricity Sector

  • Advisor : ANA CARLA BITTENCOURT REIS
  • COMMITTEE MEMBERS :
  • ALTINO JOSÉ MENTZINGEN DE MORAES
  • ANA CARLA BITTENCOURT REIS
  • ARI MELO MARIANO
  • PATRICIO ESTEBAN RAMIREZ CORREA
  • Data: Aug 5, 2022


  • Show Abstract
  • Making more assertive and efficient decisions in the face of scarce public resources while considering all potential alternatives has become one of the most common problems for managers responsible for open government data. Open data ecosystems around the world are introducing guidelines and targets focused on electric power, given the growing awareness of topics such as the water crisis, climate change, renewable resources, and initiatives to increase energy efficiency. Moreover, the subjectivity and imprecision in the process of opening data from this sector can make this task even more complex, especially when there are no specific measures to support decision-making. Thus, this study proposes to simplify and automate this process, combining two different multicriteria decision-support methods in a model capable of assessing and prioritizing risk criteria in the context of Open Data, presenting the results via interactive online dashboards developed in R. The methodology combines the AHP and TOPSIS-2N methods, creating a ranking of the open governmental datasets of the electricity sector in light of the risk criteria evaluated by the proposed model. The AHP technique was used to specify and normalize the importance of each criterion, considering the consistency of the decision matrix. The next step was to apply the TOPSIS-2N method to sort and prioritize these datasets. The results identify the datasets that should be improved with respect to their metadata and the prioritized topics, making decision-making for the management of these bases more agile and assertive. Different types of metadata are used to measure the five identified risk criteria. The proposed model is useful not only for managers responsible for decisions involving the contribution of resources to improve the datasets already available but also for prioritizing the most relevant topics for data opening. Open data related to power sector planning (BD46 and BD45) and energy tax (BD47) stood out at the top of the model's ranking, suggesting potential added value generated by these bases.
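The two-stage pipeline the abstract describes — AHP to derive criterion weights from pairwise comparisons, then TOPSIS to rank alternatives against an ideal solution — can be sketched as below. This is an illustrative simplification: it uses plain TOPSIS rather than the TOPSIS-2N variant (which applies two normalizations), and all input matrices are hypothetical.

```python
import math

def ahp_weights(pairwise):
    """AHP priority vector via the column-normalization approximation:
    normalize each column of the pairwise-comparison matrix, then
    average across each row."""
    n = len(pairwise)
    col = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    return [sum(pairwise[i][j] / col[j] for j in range(n)) / n
            for i in range(n)]

def topsis_scores(matrix, weights, benefit):
    """TOPSIS closeness coefficient for each alternative (higher is
    better). benefit[c] is True when criterion c is to be maximized."""
    m, n = len(matrix), len(weights)
    # vector-normalize each criterion column, then apply the AHP weights
    norm = [math.sqrt(sum(matrix[a][c] ** 2 for a in range(m))) or 1.0
            for c in range(n)]
    v = [[weights[c] * matrix[a][c] / norm[c] for c in range(n)]
         for a in range(m)]
    pick = lambda c, best: (max if benefit[c] == best else min)(
        v[a][c] for a in range(m))
    ideal = [pick(c, True) for c in range(n)]   # positive-ideal solution
    anti = [pick(c, False) for c in range(n)]   # negative-ideal solution
    scores = []
    for a in range(m):
        d_plus = math.sqrt(sum((v[a][c] - ideal[c]) ** 2 for c in range(n)))
        d_minus = math.sqrt(sum((v[a][c] - anti[c]) ** 2 for c in range(n)))
        scores.append(d_minus / (d_plus + d_minus) if d_plus + d_minus else 1.0)
    return scores
```

Sorting the datasets by descending closeness coefficient yields the ranking; in the dissertation's setting, each row would be a dataset and each column one of the five risk criteria.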

3
  • Murilo Góes de Almeida
  • Implementation of the OAuth2 Protocol and OpenID Connect in a Microservices Oriented Architecture

  • Advisor : EDNA DIAS CANEDO
  • COMMITTEE MEMBERS :
  • EDNA DIAS CANEDO
  • EDUARDO LUZEIRO FEITOSA
  • RODRIGO BONIFACIO DE ALMEIDA
  • SERGIO ANTONIO ANDRADE DE FREITAS
  • Data: Aug 29, 2022


  • Show Abstract
  • The Intelligence Department of the Military Police of São Paulo State (CIPM/PMESP) has been evolving its legacy systems, which currently follow a microservices architecture. This architecture splits an application into small services, which are implemented independently, each with its own deployment unit. The microservice architecture can bring benefits; however, it also presents challenges, especially regarding security aspects. This creates a need to explore security issues in microservices, especially in authentication and authorization. In addition, intelligence activity must observe a series of principles, such as compartmentalization, which restricts access to certain information to professionals who need to know it. This reinforces the need to verify whether the current environment is adequate in terms of authentication and authorization, to ensure compliance with this principle. This work aims to propose and evaluate the security of a microservices-oriented solution to perform authentication and authorization in the Intelligence Systems of the Military Police of the State of São Paulo. To this end, a Systematic Literature Review (SLR) is carried out to identify the main challenges in authentication and authorization in the microservice architecture, the mechanisms that address such challenges, and the open-source technologies that implement those mechanisms. Based on the SLR findings, a validation survey is carried out to verify whether the findings are observed in industry, confirming their practical use and seeking answers not found in the SLR. Finally, an implementation is carried out in the PMESP intelligence environment with the open-source technology that implements the mechanisms found, applying security tests in the current and new environments to verify whether the applied mechanisms improved the security of the application.
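In OAuth2/OpenID Connect deployments of the kind the abstract describes, each microservice (or an API gateway in front of them) validates a signed access token before serving a request. The sketch below shows the general idea using only the Python standard library; it checks an HS256-signed JWT's signature, expiry, and audience. It is a simplification — the dissertation's actual stack is not reproduced here, production setups typically verify RS256 tokens against the provider's published JWKS, and the audience name is hypothetical.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url_decode(s):
    """Decode base64url, restoring the padding JWTs strip off."""
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def verify_hs256_jwt(token, secret, expected_aud):
    """Minimal resource-server-side check of an HS256-signed JWT:
    signature, expiry, and audience. Returns the claims dict on
    success, None on any failure."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        return None  # signature does not match
    claims = json.loads(b64url_decode(payload_b64))
    if claims.get("exp", 0) < time.time():
        return None  # token expired
    if claims.get("aud") != expected_aud:
        return None  # token issued for a different service
    return claims
```

Because each service only needs the verification key and the expected audience, this check can run locally on every request without a round trip to the authorization server — one of the scalability arguments for token-based authorization in microservices.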

4
  • Nivando Araujo Cavalcante
  • A Business Model to Provide Interoperability between Agencies in Operations in an Interagency Environment

  • Advisor : MARCELO ANTONIO MAROTTA
  • COMMITTEE MEMBERS :
  • EDISON ISHIKAWA
  • MARCELO ANTONIO MAROTTA
  • MARCIO DE CARVALHO VICTORINO
  • MARIA CLÁUDIA REIS CAVALCANTI
  • Data: Sep 15, 2022


  • Show Abstract
  • Operations in Interagency Environments with the participation of the Armed Forces (AF) and civilian agencies are increasingly frequent. In this context, success requires interoperability between the agencies involved, with information being shared in pursuit of the operation's common objective. However, to make this integration possible, several barriers need to be overcome, which reduces the interest of institutions in sharing their data. Despite the existence of several studies in the literature on the subject, there are still gaps to be addressed. Among them is the need for a business model that allows greater scope, in terms of participants and data, and that uses a strategy to encourage agencies to share their information. In this work, a new business model is proposed, based on an architecture whose central element is an API Gateway and which includes, among other components, a Pricing and Financial Control module. The main objective is to provide the interoperability necessary for the monetized sharing of information between the entities involved, directly or indirectly, in Interagency Operations with the participation of military and civilian agencies. The model is also intended to enable comprehensive data sharing and to serve as a tool to encourage integration, alleviating the obstacles created by the cost of making data available. The incentive is provided by monetizing each sharing event between entities. To validate the proposed model, a prototype was implemented and tests were carried out in a controlled environment. The results were analyzed to verify the correct execution of the functionalities foreseen in the conceptual model, and the performance of the prototype during execution was also analyzed. Finally, an analysis was carried out regarding the economic viability of the proposed model.

5
  • Daniele Adriana Goulart Lopes
  • Botnet detection based on network flows using inverse statistics

  • Advisor : JOAO JOSE COSTA GONDIM
  • COMMITTEE MEMBERS :
  • JOAO JOSE COSTA GONDIM
  • LUIS PAULO FAINA GARCIA
  • Camila Ferreira Thé Pontes
  • LUCAS BONDAN
  • Data: Sep 22, 2022


  • Show Abstract
  • Botnet is a network of infected computers, which are remotely controlled by a cybercriminal, called a botmaster, whose objective is to carry out massive cyber attacks, such as DDoS, SPAM and information theft. Traditional botnet detection methods, usually signature-based, are unable to detect unknown botnets. Behavior-based analysis has shown promise for detecting current botnet trends, which are constantly evolving. Considering that botnet attacks on the IT infrastructure of the Brazilian Army's Mobile Operations Coordination Center (CCOp Mv) may harm the success of operations, through theft of sensitive information or even interruption of critical CCOp Mv systems, this dissertation proposes a botnet detection mechanism based on network flow behavior analysis. The main objective is to provide an additional layer of cyber protection for the CCOp Mv IT infrastructure. The technique used to detect botnets is the recently developed Energy-based Flow Classifier (EFC). This technique uses inverse statistics to detect anomalies and has an important characteristic: its easy adaptation to new domains. Due to this characteristic, EFC is a promising technique for detecting unknown botnets. EFC uses only benign data to infer the detection model and classifies as malicious any flow that deviates from the normal traffic pattern learned during model training. Thus, in addition to flows related to botnet activities, other types of malicious activities may be detected, making further verification necessary to identify the type of malicious activity detected. Two heterogeneous datasets, CTU-13 and ISOT HTTP, were used to evaluate the efficiency of the model, and the results were compared with several traditional algorithms. Preliminary results show that EFC performed well when tested in the same domain, and tests performed across different domains show that EFC maintains stable results regardless of the domain, unlike the other models tested.
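The one-class scheme the abstract describes — train on benign flows only, assign each new flow an "energy", and flag flows whose energy exceeds a threshold derived from the training data — can be illustrated with the sketch below. This is a deliberately simplified, independent-feature analogue: the real EFC also models pairwise feature couplings via an inverse Potts model, which is not reproduced here.

```python
import math
from collections import Counter

class SimpleEnergyClassifier:
    """One-class anomaly detector in the spirit of EFC (simplified):
    per-feature statistics are learned from benign flows only, a flow's
    'energy' is its negative log-likelihood under those statistics, and
    flows above a percentile threshold of the training energies are
    flagged as malicious."""

    def fit(self, benign_flows, quantile=0.95):
        self.n = len(benign_flows)
        # one value-frequency table per feature position
        self.counts = [Counter(f[i] for f in benign_flows)
                       for i in range(len(benign_flows[0]))]
        # threshold: the chosen percentile of benign training energies
        energies = sorted(self.energy(f) for f in benign_flows)
        self.threshold = energies[min(int(quantile * self.n), self.n - 1)]
        return self

    def energy(self, flow):
        # add-one smoothing so values never seen in training get a
        # finite but high energy contribution
        return -sum(math.log((self.counts[i][v] + 1) / (self.n + 1))
                    for i, v in enumerate(flow))

    def predict(self, flow):
        return "malicious" if self.energy(flow) > self.threshold else "benign"
```

The key property the abstract highlights carries over even in this toy version: because only benign traffic is used for training, any sufficiently unusual flow raises the energy, whether it comes from a botnet or some other malicious activity.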

6
  • Yuri Rodrigues Fialho
  • Organizational Interoperability: An Approach with Knowledge-Intensive Processes Ontology and Multi-Agent Systems

  • Advisor : EDISON ISHIKAWA
  • COMMITTEE MEMBERS :
  • RICARDO CHOREN NOYA
  • EDISON ISHIKAWA
  • LETICIA LOPES LEITE
  • MARCIO DE CARVALHO VICTORINO
  • Data: Sep 27, 2022


  • Show Abstract
  • Organizational interoperability is important for organizations to be able to work together and collaboratively to achieve a common goal. Interoperability at this level involves the integration of different agencies, with acquisition, sharing, storage and use of knowledge that normally depend on the people who perform the processes. Thus, processes at this level are considered Knowledge-Intensive Processes (KIPs). Studies at this level of interoperability are restricted to structured processes, with a lack of studies addressing unstructured processes. This work followed the Design Science Research methodology, focusing on this research gap and using a Knowledge-Intensive Processes Ontology and Multi-Agent Systems. To this end, the Motirõ model is presented, which is capable of establishing interoperability at all levels, covering both KIPs and structured processes in dynamic and volatile contexts. The evaluation was carried out in three different scenarios, one of them being the Belo Horizonte City Contingency Plan, in which a questionnaire based on the characteristics of the Maturity Model for Enterprise Interoperability was applied to verify the ability of the model prototype to achieve operational interoperability. Finally, after the evaluation, it was concluded that the model is able to achieve organizational interoperability in dynamic and volatile contexts, in addition to supporting KIP executors in decision making.

7
  • Marcos Francisco da Silva
  • MFOG: An Architecture in Fog for the CCOp Mv Application Infrastructure

  • Advisor : ALETEIA PATRICIA FAVACHO DE ARAUJO VON PAUMGARTTEN
  • COMMITTEE MEMBERS :
  • ALETEIA PATRICIA FAVACHO DE ARAUJO VON PAUMGARTTEN
  • ALEXANDRE DA COSTA SENA
  • GERALDO PEREIRA ROCHA FILHO
  • MARCELO ANTONIO MAROTTA
  • Data: Nov 4, 2022


  • Show Abstract
  • In disaster response operations, a great deal of coordination of various agents and institutions is required. The use of tactical and operational software is of great importance for the success of the operation. However, operations can take place in adverse locations with scarce Internet access. This limitation can cause problems in the provision of services, since such applications rely on remote processing, typically performed in a distant datacenter. In this context, this research proposes MFog, an architecture that explores the concepts of fog computing with a focus on application resilience. The architecture can be used in places where, despite the existence of structures with computational resources close to the end user, Internet access is limited (similar to what happens in these operations). To this end, minimal architecture components and orchestration mechanisms are defined based on container application migration models. The results obtained from a case study indicate that the proposed architecture can be resilient to the problems faced in the scenarios of these operations.

8
  • Tiago Pereira Vidigal
  • When Technical Solutions are not Enough: Analysing Challenges at Delivery in Mixed-Signal projects inside Design Houses - Software Engineering Research Perspective

  • Advisor : CARLA SILVA ROCHA AGUIAR
  • COMMITTEE MEMBERS :
  • CARLA SILVA ROCHA AGUIAR
  • GEORGE MARSICANO CORREA
  • RENATO CORAL SAMPAIO
  • PAULO ROBERTO MIRANDA MEIRELLES
  • Data: Dec 12, 2022


  • Show Abstract
  • Research on Software Engineering is evolving in its socio-technical analysis and research methods to address challenges in industry. Other technology sectors with little research on this topic, such as integrated circuits, could benefit from those studies. This work is a case study to identify challenges in delivery processes on mixed-signal projects at a specific design house, considering both technical and social perspectives. We performed 17 semi-structured interviews and examined the available process documents. We analyzed the data using Socio-Technical Grounded Theory. We mapped 14 constructs, 13 propositions, and two explanations to identify the challenges faced by mixed-signal teams. This study discusses the social aspects of semiconductor research, and we propose a list of possible delivery challenges in the context of the focal company. The suggested root causes of challenges in mixed-signal projects are the incipient process specifications and the lack of collaboration practices in cross-functional teams. Requirements Engineering and DevOps studies provide insights into improvement opportunities for the organization.

