|9:00 – 9:20||Documentary Video on the University of Bahrain||Teams Channel|
|9:20 – 9:23||Opening Speech – President Video|
|9:23 – 9:25||Opening Speech – PGSR Video|
|9:25 – 9:30||Opening Speech – Steering Committee Video|
|9:30 – 9:40||Break|
|9:40 – 11:00||Technical sessions 1||Teams Channel|
Tracks (S1A to S1I)
|11:00 – 11:10||Break|
|11:10 – 12:30||Technical sessions 2||Teams Channel|
Tracks (S2A to S2I)
|12:30 – 14:00||Break|
|14:00 – 15:20||Technical sessions 3||Teams Channel|
Tracks (S3A to S3I)
|15:20 – 15:30||Break|
|15:30 – 16:50||Technical sessions 4||Teams Channel|
Tracks (S4A to S4I)
|16:50 – 17:00||Break|
|17:00 – 17:30||Plenaries 1||Teams Channel|
|17:30 – 17:40||Break|
|17:40 – 18:00||INFORMS-BH Group meeting||Teams Channel|
|9:30 – 10:00||Plenaries 2||Teams Channel|
|10:00 – 10:10||Break|
|10:10 – 11:30||Technical sessions 5||Teams Channel|
Tracks (S5A to S5H)
|11:30 – 11:40||Break|
|11:40 – 13:00||Technical sessions 6||Teams Channel|
Tracks (S6A to S6H)
|13:00 – 14:30||Break|
|14:30 – 15:50||Technical sessions 7||Teams Channel|
Tracks (S7A to S7H)
|15:50 – 16:00||Break|
|16:00 – 16:30||Plenaries 3||Teams Channel|
|16:30 – 16:40||Break|
|16:40 – 17:00||Closing Session and Closing Remarks||Teams Channel|
Sunday, November 8
S1B: Medical Decision Making 1
9:40 Towards an Intelligent Decision Making of Ti-based Powders Selection for Medical Manufacturing
Ivan Izonin and Tetiana Tepla (Lviv Polytechnic National University, Ukraine); Dmytro Danylyuk (Danylo Halytsky Lviv National Medical University, Ukraine); Roman Tkachenko, Zoia Duriagina and Ihor Lemishka (Lviv Polytechnic National University, Ukraine)
Decision making is an important task in many industries. Intelligent decision support systems are particularly important in areas where the final decision is difficult for an operator to make and significantly affects both material costs and human health. This paper investigates the use of artificial intelligence tools in materials science. The authors solve the problem of choosing the optimal class of Ti-based powders, with a given fractional composition and properties, for the manufacture of medical implants by 3D printing. For this purpose, the efficiency of a number of existing and improved machine learning algorithms is investigated on the collected data sample. Experimental evaluations of their accuracy and runtime are presented, and the optimal machine learning method for solving the problem is identified and substantiated.
10:00 Assessing the COVID-19 Performance Indicators Used in the Portuguese Daily Situation Report
Organizations have been using performance indicators to, among other things, support decision-making. However, as reported in the literature, there are obstacles to their design and use. This work analyses the Portuguese COVID-19 daily situation report and identifies the 18 performance indicators present in the four-page report. It then identifies the most relevant problems associated with their definition and communication, such as uncertain data quality, too many indicators, a lack of human resources dedicated to performance measurement, the time and difficulty of data analysis, and the lack of time for data collection, all of which may limit their use in supporting decision-making. It ends by presenting the requirements that should be fulfilled to reduce the identified problems.
10:20 A Noninvasive Approach Using Multi-tier Deep Learning Classifier for the Detection and Classification of Breast Neoplasm Based on the Staging of Tumor Growth
Statistics estimate that one in eight females worldwide is affected by breast cancer. The scope of this work is to formulate a clinical protocol for detecting early-stage breast cancer and staging it non-invasively, differentiating benign and malignant tumors with a novel ensemble approach. The ensemble use of signal processing by a recurrent neural network (RNN) and image processing by a deep convolutional neural network (DCNN) for the characterization of breast cancer yields the best result in disease prognosis. A DCNN-based system model named AlexNet is used to effectively classify breast neoplasms with optimal results on the mammogram dataset. In the DCNN, the last, fully connected (FC) layer is linked to a support vector machine (SVM) classifier. A classifier fusion technique is adopted to combine the results of both imaging and signal processing to obtain the best classification result. Once a case is classified with a high true-positive value, it proceeds to Raman spectroscopy for the identification of spectral features associated with cells and tissues, such as DNA, carbohydrates, nucleic acids, lipids, and proteins, during the formation of breast neoplasm in the suspected candidate’s blood plasma samples. A Long Short-Term Memory (LSTM) based RNN is utilized to classify breast neoplasm features from the spectral dataset. After classification, a combination of principal component analysis with factorial discriminant analysis (PCA-FDA) is used to find the different stages of cancer growth. The method shows promising specificity and sensitivity for all stages.
10:40 Prediction and Classification of Rheumatoid Arthritis using Ensemble Machine Learning Approaches
Shanmugam Sundaramurthy (Anna University & Kongunadu College of Engineering and Technology, Thottiam, Trichy, India); Saravanabhavan C (Kongunadu College of Engineering and Technology, India); Pravin Kshirsagar (AVN Institute of Technology, India)
Rheumatoid arthritis (RA) is regarded as an auto-immune illness that affects the musculoskeletal system, causing inflammatory, systemic, and chronic effects. RA is generally progressive and diminishes physical functionality, leading to articular damage and fatigue. Overall, RA harms bone and joint cartilage, weakens muscle joints, and destroys joints. In this investigation, medical disorder classification based on RA is performed with ensemble methods. Real-time RA data were collected from the Sakthi Rheumatology center, comprising 1000 patient profiles (750 RA-affected and 250 non-affected). This dataset poses a classification problem with numerous numerical features. Three ensemble algorithms, namely SVM, AdaBoost, and random subspace, were used in this investigation. These ensemble classifiers use k-NN and random forest for baseline measurements. Data classification is performed with 10-fold cross-validation, and evaluation uses performance metrics such as accuracy, precision, and ROC. The values of these metrics were compared across the baseline algorithms and the various ensemble classifiers; the comparison shows that combining base classifiers into ensembles provides a substantial improvement.
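The evaluation protocol named above, 10-fold cross-validation with accuracy as one of the metrics, can be sketched as follows; the toy data and the 1-nearest-neighbour base learner are stand-ins for the paper's RA dataset and classifiers, not the authors' implementation:

```python
import random

def k_fold_indices(n, k=10, seed=0):
    """Shuffle indices and split them into k interleaved folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def one_nn_predict(train, query):
    """Toy 1-nearest-neighbour classifier (a stand-in for the base learners)."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))[1]

def cross_validate(data, k=10):
    """Mean accuracy over k folds, mirroring the 10-fold protocol."""
    folds = k_fold_indices(len(data), k)
    accs = []
    for fold in folds:
        held = set(fold)
        test = [data[i] for i in fold]
        train = [data[i] for i in range(len(data)) if i not in held]
        hits = sum(one_nn_predict(train, x) == y for x, y in test)
        accs.append(hits / len(test))
    return sum(accs) / len(accs)

# Toy, roughly separable data: label 1 when the first feature exceeds 0.5
data = [((i / 40, (i * 7 % 40) / 40), int(i / 40 > 0.5)) for i in range(40)]
acc = cross_validate(data)
```

Each fold is held out exactly once, so the reported accuracy averages ten disjoint test sets, as in the paper's protocol.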
S1C: Computerized Decision Aid 1
9:40 Empirical Method of Evaluating the Numerical Values of Metrics in the Process of Medical Software Quality Determination
Vasyl Sheketa and Valentyn Zorin (Ivano-Frankivsk National Technical University of Oil and Gas, Ukraine); Svitlana Chupakhina and Nataliia Kyrsta (Vasyl Stefanyk Precarpathian National University, Ukraine); Mykola Pasieka (Ivano-Frankivsk National Technical University of Oil and Gas & Ukraine, Ukraine); Nadia Pasieka (Vasyl Stefanyk Precarpathian National University, Ukraine)
A method is presented for scoring medical software failures through individual numerical characteristics, which makes it possible to obtain a validated measure of the protection of specialized medical software and to identify potentially problematic aspects. An empirical method is described that calculates the corresponding characteristic metrics, estimates the value of a general medical software quality index, and determines its critical points of use. On the basis of these results, it is proposed to work with expert consultants on solutions and innovations, and to resolve the problems that may arise. All formulas are based on the ISO/IEC 25010 standard.
10:06 Deep Convolutional Neural Network based Feature Extraction with Optimized Machine Learning Classifier in Infant Cry Classification
Crying is the only mode of communication for babies to share information with their environment. Expert knowledge is normally required to pre-process audio signals and extract features from them, whereas deep learning requires little pre-processing and automatically extracts the crucial features directly from the data. This paper presents a deep learning based feature extraction and machine learning based classification approach to infant cry classification and compares various machine learning algorithms for the task. Audio signals of 4 seconds were converted into spectrogram images and taken as input; a deep convolutional neural network extracted the features, on the basis of which the data were classified using machine learning algorithms such as SVM, Naïve Bayes, and KNN, whose performance was then compared under Bayesian hyper-parameter optimization. The experimental results show that SVM outperforms K Nearest Neighbor and Naïve Bayes in classifying infants' hunger, pain, and sleepy cries.
10:33 Computerized Decision Aid Applied to Meshless Method for the Use Case: Wave-Structure Interactions
Mohamed Loukili (Sciences Faculty Ben M’sik University Hassan II Casablanca, Morocco); Kamila Kotrasova (Technical University of Kosice, Kosice, Slovak Republic, Slovakia); Mustapha Mouhid (Hassan II University, Morocco)
In this paper, a meshless numerical model is proposed to simulate the propagation of linear free-surface water waves, using the fundamental solution of the Laplace equation as the radial basis function. Further, a spatial parameter is introduced to avoid the ill-conditioning effect without resorting to the use of source points. We are interested in the interaction mechanisms of a rectangular obstacle placed at the bottom of a numerical wave tank (NWT) in the presence of waves, in order to provide details on the attenuation process and to validate the numerical tool established for the treatment of this issue. A comparison of the numerical and experimental outcomes is presented and discussed with respect to these targets.
S1D: Decision Aid in Accounting and Auditing 1
9:40 Board of Directors and Intellectual Capital Disclosure: Evidence from GCC Countries
This study examines the influence of board of directors' characteristics on intellectual capital disclosure (ICD) in the annual reports of Gulf Cooperation Council (GCC) listed firms. Using content analysis and the ordinary least-squares regression method, it was found that board size, meetings, and experience are drivers of the level of intellectual capital disclosure. The results of this study are undoubtedly of great concern to regulatory bodies in setting guidelines and procedures to encourage the disclosure of intellectual capital in annual reports. This paper contributes to the literature as follows: it examines the relationship between board characteristics and ICD over six years (2014-2019) and provides new evidence on the importance of corporate governance variables and their impact on firms' ICD in emerging stock markets.
10:00 Sentiment Analysis of Banks’ Annual Reports and Bank Features: LASSO Approach
The current study investigates the effect of bank features on disclosure tones, classified into “positive”, “negative”, and “net”, across 630 annual reports for a sample of 63 listed conventional banks from eight emerging markets, namely Egypt, Jordan, Qatar, Oman, Saudi Arabia, Kuwait, Bahrain, and the United Arab Emirates, covering a period of ten years (2008-2018). To achieve this objective, LASSO regression was employed to design three models (positive, negative, and net). Surprisingly, the results of these models are similar. Our results reveal that bank performance and the bank's beta are statistically significant and positively associated with tone in all three models. In contrast, bank size, leverage, and the market-to-book ratio are not statistically significant in any model. These results suggest that managers of banks with good financial performance and stock volatility tend to send signals to stakeholders in order to distinguish themselves from others. Other banks with good financial performance may behave differently: they may use a negative tone in their reports as a strategy to avoid blame in the future if they suffer losses.
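As a rough illustration of the LASSO technique named above, here is a minimal coordinate-descent sketch with the standard soft-thresholding update; the data and penalty value are invented for illustration and do not reproduce the authors' bank models:

```python
def soft_threshold(z, t):
    """Soft-thresholding operator, the core of LASSO coordinate descent."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso(X, y, lam, iters=200):
    """Coordinate descent minimising (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            zj = sum(X[i][j] ** 2 for i in range(n)) / n
            b[j] = soft_threshold(rho, lam) / zj
    return b

# Toy data: y depends on feature 0 only; LASSO should zero out feature 1
X = [[1, 0.1], [2, -0.2], [3, 0.1], [4, -0.1], [5, 0.2]]
y = [2.1, 3.9, 6.0, 8.1, 9.9]  # roughly y = 2 * x0
b = lasso(X, y, lam=0.5)
```

The L1 penalty shrinks the irrelevant coefficient exactly to zero, which is the variable-selection behaviour that motivates LASSO over ordinary least squares in studies with many candidate bank features.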
10:20 Corporate Governance Mechanisms and Firms’ Dividend Payout Policies: Evidence from Bahrain
This study aims to examine the impact of corporate governance (CG) mechanisms on the dividend payout policies of Bahraini listed companies. Using a multivariate regression model, the results show a significant relationship between firms' dividend payout ratio and five CG mechanisms (i.e., board of directors' meetings, board size, number of board committees, audit committee meetings, and number of independent audit committee members). On the other hand, the relationship between the dividend payout ratio and the remaining two CG mechanisms (i.e., audit committee size and independent audit committee chairmen) was found to be insignificant. The findings will help shareholders, lenders, and policymakers in Bahrain develop sound dividend payout policies that contribute to increasing investors' returns, protecting lenders' interests, and reducing agency costs. The study also contributes to the ongoing debate on the impact of CG mechanisms on firms' dividend payout policies in Bahrain by analyzing data from the country's stock market.
10:40 Environmental, Social, and Governance (ESG) disclosure and firm performance: Evidence from GCC Banking sector
Sustainability reporting allows companies to reflect their impact on a range of sustainability issues and to raise the level of transparency required by all stakeholders. Environmental, Social, and Governance (ESG) factors are the main components of sustainability reporting. This study aims to examine the impact of ESG disclosure on banks' performance, investigating 26 listed banks in GCC countries for the period 2016-2019 and using return on equity (ROE) and return on assets (ROA) to measure performance. The empirical results show that sustainability reporting has a significantly negative impact on banks' performance.
S1E: Statistical Decision Making 1
9:40 Application of Fuzzy TOPSIS Algorithm for selecting best family car
Arijit Ghosh (St. Xavier’s College (Autonomous) Kolkata, India)
In this paper, we demonstrate an application of fuzzy set theory combined with the Technique for Order of Preference by Similarity to Ideal Solution (FTOPSIS) to the selection of the best family car. We solve the problem using real-life data: imprecise linguistic qualitative preferences are converted into fuzzy numbers for representation and subsequent solution.
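A minimal sketch of the fuzzy TOPSIS ranking step, assuming triangular fuzzy numbers, benefit-type criteria, and the common vertex-distance measure; the linguistic scale and car ratings below are hypothetical, not the paper's data:

```python
from math import sqrt

def tfn_distance(a, b):
    """Vertex distance between two triangular fuzzy numbers (l, m, u)."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def fuzzy_topsis(matrix):
    """Closeness coefficients for alternatives rated by TFNs on benefit criteria."""
    n_crit = len(matrix[0])
    # Normalise each benefit criterion by its largest upper bound
    u_max = [max(row[j][2] for row in matrix) for j in range(n_crit)]
    norm = [[(l / u_max[j], m / u_max[j], u / u_max[j])
             for j, (l, m, u) in enumerate(row)] for row in matrix]
    fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)  # fuzzy positive/negative ideals
    scores = []
    for row in norm:
        d_pos = sum(tfn_distance(c, fpis) for c in row)
        d_neg = sum(tfn_distance(c, fnis) for c in row)
        scores.append(d_neg / (d_pos + d_neg))  # closer to 1 means better
    return scores

# Linguistic ratings mapped to TFNs, e.g. "good" -> (5, 7, 9), "fair" -> (3, 5, 7)
cars = [
    [(5, 7, 9), (5, 7, 9)],  # car A: good comfort, good safety
    [(3, 5, 7), (1, 3, 5)],  # car B: fair comfort, poor safety
]
scores = fuzzy_topsis(cars)
```

The alternative with the larger closeness coefficient is ranked best, which is how the linguistic preferences end up ordering the candidate cars.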
10:00 A Discrete Analogue of the Half-Logistic Distribution
In lifetime modeling, the observed measurements are usually discrete in nature, because the values are measured to only a finite number of decimal places and cannot really constitute all points in a continuum. For example, the survival time of a cancer patient can be measured as the number of months he/she survives. Then, even if the lifetime (of a patient, a device, etc.) is intrinsically continuous, it is reasonable to consider its observations as coming from a discretized distribution generated from an underlying continuous model. In this work, a discrete random distribution, supported on the non-negative integers, is obtained from the continuous half-logistic distribution by using a well-established discretization technique, which preserves the functional form of the survival function. Its main statistical properties are explored, with a special focus on the shape of the probability mass function and the determination of the first two moments; we discuss and compare, both theoretically and empirically, two different methods for estimating its unique parameter. This discrete random distribution can be used for modeling data exhibiting excess of zeros and over-dispersion, which are features often met in the insurance and ecology fields: an example of application is illustrated. An extension of this discrete distribution is finally suggested, by considering the generalized half-logistic distribution, which introduces a second shape parameter allowing for greater flexibility.
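The discretization technique described above, which preserves the functional form of the survival function, can be sketched as follows (a unit scale parameter is assumed for illustration):

```python
from math import exp

def survival(x, theta=1.0):
    """Survival function of the continuous half-logistic distribution, x >= 0."""
    return 2.0 / (1.0 + exp(x / theta))

def pmf(k, theta=1.0):
    """Discretisation preserving the survival function: P(X = k) = S(k) - S(k + 1)."""
    return survival(k, theta) - survival(k + 1, theta)

# Since S(0) = 1, the probability mass over the non-negative integers sums to 1
total = sum(pmf(k) for k in range(200))
```

Because the discrete survival function agrees with the continuous one at every integer, monotonicity and tail behaviour carry over directly from the parent half-logistic model.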
10:20 The Application of TOPSIS in the Selection of Statistical Prediction Model: A Forensic Ink Analysis Case Study
Forensic ink analysis aims to determine the source of an unknown ink entry found on a questioned document. A forgery indicator is established if the source of a questioned ink entry differs from that of the other ink entries on the same document. A statistical predictive model built from chemical data of pen ink is useful in accomplishing this goal. Multiple statistical prediction models can be constructed from chemical data after it is preprocessed via multiple data preprocessing techniques. Commonly, a hypothesis test is employed to choose the best prediction model; however, when the number of models and performance metrics is overwhelming, this can be a laborious task. This work explores the feasibility of the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) for selecting the best predictive model. A total of 40 models were constructed from attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectral data of blue gel pen inks via the Partial Least Squares-Discriminant Analysis (PLS-DA) technique. The spectral data were preprocessed using nine different data preprocessing techniques, considering the global region and three different sub-spectral regions. The results show that TOPSIS produced model performance rankings similar to those derived via the conventional approach, i.e., univariate hypothesis testing. The advantages of the TOPSIS method in this context are that no statistical assumptions (e.g., normality) must be satisfied, and that a numerical score presented together with a rank allows inference of the relative differences between models.
10:40 Solving Stochastic Linear Quadratic Games in Discrete Time with Two Players Using Exact Line-Double Newton Method
In this paper, we consider a two-player stochastic linear quadratic differential game with an infinite horizon in discrete time, assuming no cooperation between the two players. For the given system, the major problem is solving the pair of stochastic discrete algebraic Riccati equations (SDAR) arising from the system and its quadratic regulator in order to find the optimal control. Thus, we construct a numerical method for solving the SDAR using a modified Newton method, which combines the original Newton method with an exact line search to condense the Newton iteration. We provide a numerical simulation to show the performance of the method.
S1F: Management Decision 1
9:40 Micromanagement’s impact on banks’ performance
Whether a micromanagement leadership style is always harmful to the organization or can sometimes yield positive outcomes is widely debated nowadays. This article investigates the impact of micromanagement on banks' performance by examining the perceptions of banks' managers and employees toward such a leadership style. Data were first collected from interviews with 10 bank managers to weight each variable, then from surveys distributed to 228 employees occupying different positions in the bank. Cronbach's alpha (0.769) was used to confirm the internal reliability of the survey. The outcomes are presented through descriptive statistics, and the significance of the developed hypotheses was tested using the chi-square test and Spearman's correlation analysis. Findings revealed that although the majority of the micromanagement traits examined negatively affect the bank, some cause no harm, suggesting that every work environment needs an adjusted leadership style that best fits it.
10:00 Proposed metamodels transformation from Predictive methodologies to Agile methodologies
Ibrahim Hamzane (Hassan II University, Morocco); Allae Erraissi (Hassan II University & Faculty of Sciences, Morocco); Mouad Banane (University Hassan II, Morocco); Abdessamad Belangour (Hassan II University, Morocco)
Project management models are constantly evolving; among the most widely adopted are the Agile (also called "adaptive") and predictive models. A study by the Standish Group confirms that Agile projects achieve the desired result three times more often than projects carried out with conventional (predictive) methodologies. Thus, many organizations tend to apply the new project management model or migrate from the standard model to the agile one. In this article, building on previous research, we apply the principles of MDA to define a metamodel of Agile and predictive methodologies, and then define a transformation that will help organizations transform their project management.
10:20 Guidelines of Influencer Intelligence: Positive & Negative Impact of Influencer To Community
An influencer is a netizen who influences society, with positive or negative effects depending on what they say on social media. Therefore, a guide is needed to determine the types of influencers and the effects they have when they share information, make statements, or comment, so that people know what kind of influencer they are facing. This research develops previous studies, as listed in the introduction. A survey was conducted on thirty people, covering the general public, college students, and lecturers, to determine influencers' influence on social media. The result of this study is a guide for social media users to identify the types of influencers (A, B, C, and D) and the effects each type has. By knowing these types, social media users will be wiser in responding to everything on social media. The method used to develop this research is the Johari window, a method with four flexible perspectives applied to this research. Furthermore, this research will produce innovations in communicating on social media: users will be wiser in giving comments and criticism, so that a more positive social media culture can be created and make a positive contribution to society.
10:40 Optimal engine technology mix in a low carbon economy
Environmental regulations force car manufacturers to renew the powertrain technology portfolio offered to the customer to comply with greenhouse gas (GHG) emission targets. Automotive companies, in turn, are faced with the decision of finding the right powertrain technology portfolio consisting of, e.g., internal combustion engines and electric vehicles, because the selection of a particular powertrain technology portfolio affects different company targets simultaneously. What makes this decision even more challenging is the fact that future market shares of the different technologies are uncertain. With its multiple objectives, this real-world application requires multi-criteria decision-making techniques to identify the optimal powertrain technology portfolio. Our research presents a new decision support approach for assembling optimal powertrain technology portfolios while making decision-makers aware of the trade-offs between the achievable market share, the market share risk, and the GHG emissions generated by the selected vehicle fleet. The proposed approach combines ‘a posteriori’ decision-making, multi-objective optimization, and the Markowitz portfolio theory. In an application case, we feed the outlooks of selected market studies into the proposed decision support system. The result is a visualization and analysis of the current real-world decision-making problem faced by many automotive companies.
S1G: Decision Making Using IoT and ML 1
9:40 Towards Real-Time Homogeneity and Heterogeneity in Student’s Beliefs
Finding similarity in students' beliefs about trending technology is a challenging task. This paper uses cluster analysis to group homogeneous student beliefs concerning the technology provided. For this, we applied the Hierarchical Clustering (HC) approach to primary samples, using agglomerative cluster formation with Ward's method and the Squared Euclidean Distance (SED). This technique recommended a maximum of three and a minimum of two optimal clusters: with three clusters, 50% of response beliefs are covered, and with two clusters, 100%. Automatic detection using the HC method discovered that the majority of students' beliefs are positive about use, benefits, outlook, and growth. This paper presents an initial clustering approach to detect the homogeneity and heterogeneity of students' responses about technology. It might help university management to see the grouping of identical beliefs, and the technique can be deployed online to frame a new web clustering module for the university.
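The agglomerative step with Ward's method and squared Euclidean distance can be sketched as below; the toy responses and the two-cluster stopping point are illustrative stand-ins for the paper's survey data:

```python
def ward_increase(a, b):
    """Increase in within-cluster variance from merging clusters a and b (Ward)."""
    na, nb = len(a), len(b)
    ca = [sum(p[d] for p in a) / na for d in range(len(a[0]))]
    cb = [sum(p[d] for p in b) / nb for d in range(len(b[0]))]
    sed = sum((x - y) ** 2 for x, y in zip(ca, cb))  # squared Euclidean distance
    return na * nb / (na + nb) * sed

def agglomerate(points, n_clusters):
    """Repeatedly merge the pair with the smallest Ward increase."""
    clusters = [[p] for p in points]
    while len(clusters) > n_clusters:
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: ward_increase(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# Toy survey responses on a 1-5 scale: two clearly separated belief groups
responses = [(1, 2), (2, 1), (1, 1), (5, 4), (4, 5), (5, 5)]
groups = agglomerate(responses, 2)
```

Stopping at two clusters mirrors the paper's finding that two clusters cover all response beliefs; rerunning with `n_clusters=3` would give the finer grouping.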
10:00 Development of a Voice Chatbot for Payment Using Amazon Lex Service with Eyowo as the Payment Platform
Engaging in financial transactions has remained a hassle for the visually impaired due to the lack of technological products facilitating their financial independence and inclusion. Automated teller machines (ATMs) and online banking applications do not provide any means through which the blind can engage in transactions without a third party managing their finances. This study aims at building a voice chatbot device for payments using the Amazon Lex service with Eyowo as the payment platform. The chatbot is built by leveraging Artificial Intelligence (AI) technologies: the Amazon Lex service configures the bot with utterances and responses, and Lambda functions validate the responses while carrying out transactions by calling the Eyowo Application Programming Interface (API). A Raspberry Pi single-board computer is utilised as the medium of communication between the user and the chatbot. The Raspberry Pi runs a script that collects voice input through a Universal Serial Bus (USB) microphone; the input is sent to Amazon Lex to be processed using Automatic Speech Recognition (ASR) and Natural Language Understanding (NLU), and the chatbot sends back a suitable response to the user through the speaker connected to the Raspberry Pi. The device brings comfort and security to the visually impaired by supporting balance enquiries and cash transfers.
10:20 Data Analytical Framework for Internet of Things
The Internet of Things (IoT) is characterized by a colossal scale of smart objects that collaborate seamlessly with each other through a worldwide network. Data obtained from these smart objects can be used to recognize, inspect, and manage complicated environments around us, enabling better understanding, intelligent decision-making, and performance optimisation. Data analytics plays an important part in creating efficient IoT applications: it is used to extract meaningful insights from IoT data, typically in the form of smart management decisions, trends, and statistics that help IoT applications make sound decisions. Hence, exploiting data analytics in IoT applications provides tremendous benefits, including enhanced quality, increased efficiency, automation, and better decision-making. This paper provides an overview of data analytics in IoT and presents its benefits for IoT applications. Moreover, data analytical frameworks for delay-tolerant and delay-critical applications are also presented.
10:40 Time series forecasting of hourly water consumption with combinations of deterministic and learning models in the context of a tertiary building
The search for a model providing accurate predictions of water consumption is one of the major challenges in water supply systems. The Auto-Regressive Integrated Moving Average (ARIMA) model, with and without seasonality, combined with Artificial Neural Networks (ANN), is one of the most popular hybrid approaches for time-series forecasting, and such models have recently demonstrated success in water consumption forecasting. However, each relies on assumptions and shows some limitations. This study therefore proposes several new hybrid models that combine ARIMA with seasonality (i.e., SARIMA), neural approaches such as the Long Short-Term Memory (LSTM) network or the Multi-Layer Perceptron (MLP), and a deterministic model based on a time function. These hybrid models, which combine individual models, are used to predict the hourly water consumption of a tertiary building. The experiments show that the hybrid model combining the deterministic time-series model, the ANN, and SARIMA is efficient, improving the accuracy of the water consumption prediction: it reduces the mean training and testing errors by 8.24% and 5.53%, respectively, compared to all other individual or combined models.
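A minimal sketch of the hybrid idea, a deterministic time function plus a learned model fitted to its residuals; here a lag-1 autoregression stands in for the SARIMA/LSTM components, and the consumption series is synthetic:

```python
from math import sin, pi

def deterministic(t):
    """Deterministic daily profile: a 24-hour sinusoid (stand-in for the time function)."""
    return 10 + 3 * sin(2 * pi * t / 24)

def fit_ar1(residuals):
    """Least-squares lag-1 autoregression coefficient on the residual series."""
    x, y = residuals[:-1], residuals[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

# Synthetic hourly consumption = deterministic part + persistent day-level noise
series = [deterministic(t) + 0.5 * (-1) ** (t // 24) for t in range(96)]

# Hybrid fit: subtract the deterministic component, then model what is left
residuals = [v - deterministic(t) for t, v in enumerate(series)]
phi = fit_ar1(residuals)

# Hybrid one-step-ahead forecast: deterministic baseline + predicted residual
forecast = deterministic(96) + phi * residuals[-1]
```

The decomposition is the essence of the approach: the deterministic model captures the known daily shape, and the learned component only has to explain the structured remainder.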
S1H: Sustainable Decisions for a Sustainable Development 1
9:40 Social Responsibility Monitoring as the Basis of Sustainable Development
The formation of a new type of economy provides for an increasing role of intellectual and social capital in management structures at the macro and micro levels, and social entrepreneurship is becoming a strategic factor in the sustainable development of national economies. Analyzing the problems of entrepreneurship development from the standpoint of social responsibility is important both theoretically, for clarifying the social and economic nature of social entrepreneurship responsibility, and practically, for developing social entrepreneurship as a factor of sustainable development. Forming a national model of entrepreneurship social responsibility as a sustainable development factor requires deep theoretical reflection, methodological formalization, and justification of methods for monitoring the level of responsibility. The purpose of the study is to substantiate the methodological basis for analyzing entrepreneurship social responsibility as a condition for monitoring Ukraine's sustainable development.
10:00 Quantification of social impacts on workers to aid decision-making in micro and small enterprises
Diego Alexis Ramos Huarachi and Fabio Puglieri (Universidade Tecnológica Federal do Paraná, Brazil); Julio Abraham Ramos Quispe (Universidad Nacional de San Agustín de Arequipa, Peru); Antonio Francisco (UTFPR – Universidade Tecnológica Federal do Paraná – Campus Ponta Grossa, Brazil)
Micro and Small Enterprises (MSEs) are a great source of social impacts and are more likely to have worse social performance regarding their workers than bigger firms. Thus, it is important to aid decision-making to improve the social impacts of MSEs on their workers. Hence, the purpose of this study is to apply a quantification framework to the qualitative results of a previous social life cycle assessment study of MSEs and to test whether quantifying the social impacts on workers can aid decision-making in this type of organization. The method follows a four-step procedure, uses customized score factors for quantifying the social impacts on workers, and was applied in three MSEs from three service activities. The results show several social hotspots that impact MSEs' workers negatively (freedom of association, fair salary, working hours, and social benefits); accordingly, actions such as encouraging workers to join labor unions, increasing salaries, reducing working hours, and providing more social benefits are suggested to enhance the social impacts on MSEs' workers. The quantified results of the social life cycle assessment thus reveal social hotspots, provide guidance on the issues in which MSEs negatively impact their workers, aid decision-making, and encourage MSE owners to take action in order to be more responsible toward their workers and more socially sustainable.
10:20 Modeling evapotranspiration using Encoder-Decoder Model
Khadijeh Alibabaei (University of Beira Interior, Portugal); Pedro D Gaspar (University of Beira Interior & C-MAST – Center Mechanical and Aerospace Science and Technologies, Portugal); Tânia M. Lima (University of Beira Interior & C-MAST – Centre for Mechanical and Aerospace Science and Technologies, Portugal)
In recent years, deep learning algorithms have been successfully applied to develop decision support systems in fields such as medicine, bibliometrics, and many more. Agriculture is one of the most important fields in which decision support systems based on deep learning need to be explored, given its direct impact on human well-being. Agriculture is also the largest consumer of freshwater, so optimizing irrigation and developing efficient irrigation decision support systems is essential. The amount of evapotranspiration is the most critical factor in scheduling irrigation. In this work, the ability of an Encoder-Decoder Long Short-Term Memory (LSTM) model to model daily evapotranspiration was investigated for three locations in Portugal. The model predicts evapotranspiration three days into the future, given six days of historical data. Two models were trained, one with the teacher forcing method and one without. Our results show that the teacher forcing method improves the model's performance. The performance of the Encoder-Decoder LSTM was compared with a simple LSTM model, a Convolutional Neural Network, a Multi-Layer Perceptron model, and Random Forest regression; the Encoder-Decoder LSTM performed best on the test set.
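A minimal sketch of the six-days-in, three-days-out windowing described in the abstract (the function name and the synthetic series are ours; the authors' actual preprocessing may differ):

```python
import numpy as np

def make_windows(series, n_in=6, n_out=3):
    """Split a daily series into (6-day input, 3-day target) pairs."""
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])
        y.append(series[i + n_in:i + n_in + n_out])
    return np.array(X), np.array(y)

et0 = np.arange(12, dtype=float)  # stand-in for daily ET0 values
X, y = make_windows(et0)          # X: (4, 6) inputs, y: (4, 3) targets
```

Each row of `X` would be fed to the encoder, and the decoder would emit the corresponding three-day row of `y`.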
10:40 A Non-Isolated DC-DC Boost Converter with High Gain Ability for Renewable Energy Sources Applications
This paper describes a non-isolated dc-dc boost converter with high voltage gain capability. Unlike the conventional boost converter, the proposed high-gain boost converter offers high voltage conversion at a low duty ratio and therefore comparatively high efficiency. The proposed converter is based on the voltage lift (VL) technique, in which the input voltage is boosted in a step-by-step manner. A detailed analysis of the proposed converter for the non-ideal model is presented. Its performance is compared with the conventional boost converter in terms of several parameters, namely the number of elements, voltage gain ratio, and switch stress. The proposed converter has advantages over the conventional one and is suitable for renewable energy source (RES) applications, especially solar and wind energy. The results obtained for both the proposed and conventional converters in the MATLAB®/Simulink environment are validated on a lab-scale prototype.
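For orientation, the ideal gain of the conventional boost converter is 1/(1-D); as a stand-in for a voltage-lift topology we use the classic self-lift gain (2-D)/(1-D). The proposed converter's actual gain expression is given in the paper, not here:

```python
def boost_gain(d):
    """Ideal voltage gain Vout/Vin of the conventional boost converter."""
    return 1.0 / (1.0 - d)

def self_lift_gain(d):
    """Ideal gain of a basic self-lift (voltage lift) converter,
    used here only as a representative VL topology."""
    return (2.0 - d) / (1.0 - d)

for d in (0.3, 0.5, 0.7):
    print(d, boost_gain(d), self_lift_gain(d))
```

At a duty ratio of 0.5, the conventional converter doubles the input voltage while the self-lift topology triples it, illustrating why VL techniques reach high gain at lower duty ratios.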
S1I: Decision Models in Human Resource Management 1
9:40 Attracting Gen Z to Small and Medium Enterprises (SMEs): A View through the Job Characteristics Model
Theresa C.F. Ho (Universiti Teknologi Malaysia, Malaysia); Ling Suan Choo (University of Bahrain, Bahrain); Poh-Chuin Teo (Universiti Teknologi Malaysia, Malaysia); Narentheren Kaliappen (Universiti Utara Malaysia, Malaysia)
Generation Z will make up over 20% of the workforce by 2021, representing a considerable portion of the labour market. Unfortunately, recruiting Generation Z can still be a challenge for Small and Medium Enterprises (SMEs), as Generation Z prefers to work in large, established firms, the public sector, or global multinationals. This phenomenon affects a country's economy, as SMEs are the driving force of most economies in the world. The objective of this paper is to propose a conceptual framework for redesigning jobs to suit Gen Z through the lens of the Job Characteristics Model (JCM). It is hoped that this will assist SMEs in recruiting and retaining the future workforce.
10:00 Investigating the linkage between Quality of work life and Burnout in Indian IT Industry
In global competition, quality of work life is considered the asset that propels organizations towards success. The other construct, burnout, has received much research attention in the current decade. The aim of this study is to investigate the linkage between quality of work life and burnout in Indian IT organizations. To this end, a study was carried out on a sample of 350 managers employed in IT organizations in the NCR region, India. The objective was examined through factor analysis, Pearson's correlation, and step-wise multiple regression analysis to find the impact of quality of work life on burnout. The findings indicate that quality of work life has a significant negative relationship with burnout. Researchers could use the results to identify which aspects of quality of work life influence burnout most, and work towards incorporating those dimensions to reduce burnout among employees in the IT industry.
10:20 Designing the Concept of Leadership Intelligence (CI2.1) Version 2.0 inside Social Media Using Ken Watanabe Problem Solving 101 Methods
Leadership and social media are two things that cannot be separated from each other, because together they can have a significant impact on changes in culture and habits in society. This research identified several problems, such as miscommunication and information delivered out of context and content, with no solution offered but only argument and debate, which makes the social media environment unfavourable. This research develops the earlier work "Developing Leadership Intelligence (CI2) Framework in Social Media to Develop an Ethical Leader using the Johari Window Method", which categorized social media leadership into three types of leaders; this study extends the categorization to six, so that the various kinds of social media leaders are covered. The result of this study is a social media leadership framework covering how a social media leader can attain high quality in both general and specific knowledge, how to communicate effectively and efficiently, how to overcome negative things on social media, and how to exert a positive influence and produce a positive culture on social media.
10:40 Methodology of human factor influence on complex safety of enterprises
Evgeny Vladimirovich Gvozdev (Moscow state University of civil, Russia)
In the management structure of industrial enterprises, an integrated security system is required that includes the industry-specific areas (industrial and fire safety, labor protection, environmental and information security, and anti-terrorist security of the facility). To solve this problem, a methodological rationale is presented for obtaining indicators of the influence (impact) of each of the services, which makes it possible to identify problem points in the management of the complex security system under consideration. Existing methods used in the complex security of enterprises are analyzed and the features of their application considered. At the analytical stage, it is proposed to use the method of direct deterministic factor analysis, with whose help the indicators of the influence of factors can be detailed and broken down into components. At the stage of synthesizing the analytical results, it is proposed to use the goal tree method based on inverse calculations, which yields the coefficients of increase (decrease) of the gaps (erroneous actions) of service personnel arising in the dynamic process of carrying out their duties. An example of solving an inverse problem based on the construction of a goal tree is presented, characterized by ease of use, clarity, dynamism, versatility, and uniformity.
S1A: Decision Aid in Logistics and Engineering 1
9:40 A memetic metaheuristics search algorithm for load frequency regulation in multi area power system
Ardhala Bala Krishna and Sobhit Saxena (Lovely Professional University, India); Vikram Kamboj (Lovely Professional University, Jalandhar, Punjab, India); Chaman Verma (ELTE Informatikai Kar & Eotvos Lorand Tudomanyegyetem Informatikai Kar, Hungary)
The demand for electrical power is rising day by day, and this rise has led to an imbalance between supply and demand. To obtain the optimum solution for load frequency regulation of a multi-area power system, a sophisticated meta-heuristic search algorithm is required. In the proposed research, a novel hybrid memetic meta-heuristic search optimization algorithm has been developed to solve the load frequency regulation problem of a multi-area power system. The proposed hybrid algorithm combines the slime mould algorithm with the pattern search algorithm. The outcome of the suggested optimizer was tested on standard benchmark functions, and the analytical results were compared with those of other standard meta-heuristic search optimization algorithms.
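The pattern search half of the hybrid can be illustrated with a plain compass search on a benchmark sphere function (a generic sketch; the paper's hybridization with the slime mould algorithm is not reproduced here):

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, shrink=0.5):
    """Compass (pattern) search: poll +/- step along each axis,
    move on improvement, otherwise shrink the step size."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink
    return x, fx

sphere = lambda v: sum(t * t for t in v)  # standard benchmark function
best, val = pattern_search(sphere, [3.0, -2.0])
```

In the memetic scheme, such a local search would refine candidate solutions produced by the population-based slime mould algorithm.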
10:00 Fleet Management Optimization in Car Rental Industry: Decision Aid Models for Logistics Improvement
Carlos Fernandez, Antonio A Freitas and António Morais (University of Beira Interior, Portugal); Tânia M. Lima (University of Beira Interior & C-MAST – Centre for Mechanical and Aerospace Science and Technologies, Portugal); Pedro D Gaspar (University of Beira Interior & C-MAST – Center Mechanical and Aerospace Science and Technologies, Portugal)
This paper describes a case study based on two different approaches to optimizing logistics performance in a rent-a-car company, leading to improvement of the vehicle delivery process, reduced transport costs, and consequently increased customer satisfaction. The first approach involves shortest path determination, using the Dijkstra algorithm applied to a parking lot layout in two scenarios, a general and a specific approach. The second concerns max flow determination, applying and comparing the Ford-Fulkerson and Goldberg algorithms. Both approaches provided models that were simulated and analysed in MATLAB as a global decision aid model.
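The shortest-path component can be sketched with a standard heap-based Dijkstra implementation (the toy parking-lot graph below is invented for illustration):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source on a weighted digraph
    given as {node: [(neighbour, weight), ...]}."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Toy parking-lot layout (nodes and weights made up)
lot = {"gate": [("A", 2), ("B", 5)],
       "A": [("B", 1), ("exit", 7)],
       "B": [("exit", 3)]}
dist = dijkstra(lot, "gate")
```

Here the cheapest gate-to-exit route goes through B at total cost 6, the kind of result the paper's general and specific scenarios compare.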
10:20 Logistics strategy (FIFO, FEFO or LSFO) decision support system for perishable food products
Adriana Mendes, João Cruz and Tiago Saraiva (University of Beira Interior, Portugal); Tânia M. Lima (University of Beira Interior & C-MAST – Centre for Mechanical and Aerospace Science and Technologies, Portugal); Pedro D Gaspar (University of Beira Interior & C-MAST – Center Mechanical and Aerospace Science and Technologies, Portugal)
Food waste is one of the major challenges facing the world, and its elimination is therefore one of the most important issues in the perishable product market. It is thus necessary to develop and/or apply algorithms and mathematical tools that relate the perishability of food products to their cost, i.e., how the value of a product varies as a function of its deterioration over time until spoilage. This decision support system aims to assist in setting a product's price during its shelf life, maximizing profit and reducing spoilage and consequently waste. The proposed simple and expedient computational decision support system compares the FIFO, FEFO, and LSFO logistics strategies. The system is tested on a test case scenario for a specific perishable product, yogurt, to whose characteristics it is adjusted.
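The difference between the strategies reduces to the order in which stock batches are picked: FIFO sorts by arrival date, while FEFO (and LSFO, by remaining shelf life) sorts by expiry. A minimal sketch with made-up batch data:

```python
# Each batch: (batch id, day received, expiry day); values are made up.
batches = [("B1", 1, 9), ("B2", 2, 6), ("B3", 3, 12)]

fifo = sorted(batches, key=lambda b: b[1])  # First In, First Out
fefo = sorted(batches, key=lambda b: b[2])  # First Expired, First Out
```

Under FIFO the oldest delivery (B1) ships first, while under FEFO the batch closest to expiry (B2) ships first, which is what reduces spoilage for perishables like yogurt.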
10:40 Multi-Criteria Decision Making Methods and Project Delivery Approaches
The proposed study investigates the differences in project owner (employer) decision outcomes when selecting the most suitable project delivery approach, using two Multi-Criteria Decision Making (MCDM) methods: the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The study considers three primary project delivery approaches: Design and Build (DB), traditional Design Bid Build (DBB), and Construction Management. The factors affecting owners' delivery preferences were drawn from the literature, and data were collected from project owners across the United Arab Emirates (UAE). The analyses were carried out via AHP and TOPSIS on the three delivery methods, and the results of the two MCDM methods were compared to find differences in decision outcomes related to delivery type selection. The analysis showed that both methods yield the same result; hence, either can serve as a basis for deciding on the appropriate project delivery method given typical owners' preferences.
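A compact TOPSIS sketch (the alternative scores and weights below are invented; the paper's criteria and data differ):

```python
import numpy as np

def topsis(X, w, benefit):
    """Rank alternatives (rows) on criteria (columns).
    w: criteria weights; benefit[j]: True if higher is better."""
    R = X / np.sqrt((X ** 2).sum(axis=0))        # vector-normalize
    V = R * w                                     # apply weights
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - worst) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                # closeness in [0, 1]

# Made-up scores for DB, DBB, CM on (cost, speed, quality)
X = np.array([[7.0, 8.0, 6.0], [5.0, 4.0, 8.0], [6.0, 6.0, 7.0]])
scores = topsis(X, np.array([0.4, 0.3, 0.3]),
                np.array([False, True, True]))    # cost: lower is better
```

The alternative with the highest closeness coefficient is preferred; an AHP run with consistent judgments would be expected to produce the same ordering, which is the comparison the study performs.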
S2A: Decision Aid in Logistics and Engineering 2
11:10 Design and Analysis of a Closed-Loop Temperature Engineering Control System using MikroC and Proteus
Aaron Don M. Africa and Darlene Alyssa P. Abaluna (De La Salle University Manila, Philippines); Ara Jyllian Abello (De La Salle University, Philippines); Joaquin Miguel B. Lalusin (De La Salle University Manila, Philippines)
Throughout the years, automation has improved steadily, and many industries have shifted to automated systems to make various corporate processes much more efficient. On a fundamental level, automation is a control system that varies in implementation, allowing devices to operate by themselves without human intervention. Control systems have many different applications, but this paper focuses on the design and analysis of a closed-loop temperature controller. The group writes a script in mikroC PRO, which is then programmed into a PIC18F45K22 microcontroller using Proteus, where the circuit is also designed and simulated. The group tests whether the design is functional and whether it can control the temperature based not only on the user's desired temperature but also on feedback from sensors indicating the ambient room temperature. This paper can serve as a basis for future design revisions with different styles of implementation and application.
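Independent of the paper's mikroC firmware, the closed-loop idea can be illustrated with a toy on-off (hysteresis) controller (all rates and thresholds below are made up):

```python
def simulate(setpoint, temp, steps, band=0.5,
             heat_rate=0.8, cool_rate=0.3):
    """Toy on-off (hysteresis) closed-loop controller: the heater
    switches on below setpoint - band and off above setpoint + band."""
    heater = False
    history = []
    for _ in range(steps):
        if temp < setpoint - band:      # sensor feedback drives the
            heater = True               # actuator decision each cycle
        elif temp > setpoint + band:
            heater = False
        temp += heat_rate if heater else -cool_rate
        history.append(temp)
    return history

trace = simulate(setpoint=25.0, temp=20.0, steps=100)
```

The simulated temperature climbs to the setpoint and then oscillates within the hysteresis band, which is the qualitative behaviour a closed-loop temperature controller exhibits.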
11:30 A Low cost Augmented Reality system for Wide Area Indoor Navigation
Vivek Dosaya (M. S Ramaiah Institute Of Technology, India); Shashwat Varshney (M S Ramaiah Institute Of Technology, India); Vijaya Kumar Beekanahalli Parameshwarappa (MS Ramaiah Institute of Technology, Visvesvarai Technological University & AICTE, India); Akshay Beniwal and Shraddha Tak (M. S Ramaiah Institute Of Technology, India)
In today's world, there are many outdoor navigation apps for visually challenged people, but none can precisely determine a user's location inside a large structure. Indoor navigation is a complex task for the visually challenged as well as for the general public, especially in large structures like malls, airports, museums, and factories, and present solutions and technologies are complex and not cost-effective. Hence, we propose a low-cost model that uses Augmented Reality to place virtual anchors across a structure, so that a person can navigate from one location to another with the help of these anchors. The model does not use technologies like GPS, machine learning, or artificial intelligence; instead, the placed anchors are pervasive and persistent across the indoor environment for smooth navigation. Once placed, these virtual anchors remain at their location and can be used at any time by any person registered on our app. The model can be extended to the general public in any indoor space and enhanced through gamification for better user interaction and retention. It can also be extended to collaborate with the Aarogya Setu app, helping to identify routes that pass through spaces visited by COVID-positive patients so that those routes can be avoided in real-time navigation.
11:50 IIoT Benefits and Challenges with BlockChain
Amit Chaurasia (Jaypee University Solan, India)
Millions of people in the world will be using billions of devices on the future internet, all controlled by centralized or decentralized systems. The Industrial Internet of Things (IIoT) ushers in an era of massive information sharing, storage, and processing. The limitations of IIoT are the security and reliability of data, and blockchain technology has emerged as the key player in providing solutions to both. Data sharing between IIoT-based distributed centres will be enormous in the upcoming internet future and will be the bottleneck for most industries implementing IIoT. In this paper, we focus on the relationship between blockchain-IIoT integration challenges and optimal solutions for improving IIoT services, including a comparison of IoT boards.
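The data-reliability argument rests on hash chaining: each block commits to its predecessor's hash, so tampering anywhere invalidates the rest of the chain. A minimal sketch (not a production blockchain; the sensor readings are invented):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Minimal block: the payload is bound to the previous block's
    hash, so altering any block breaks every later link."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [make_block({"sensor": "s1", "temp": 21.4}, "0" * 64)]
chain.append(make_block({"sensor": "s1", "temp": 21.9}, chain[-1]["hash"]))

def valid(chain):
    """Check that each block references its predecessor's hash."""
    return all(chain[i]["prev"] == chain[i - 1]["hash"]
               for i in range(1, len(chain)))
```

This linkage is what gives IIoT records tamper-evidence: a modified reading changes its block's hash and the next block's `prev` reference no longer matches.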
12:10 Development of Microstrip Patch Antenna Integrated on Solar Cell Based Artificial Magnetic Conductor Surface for Green Wireless Applications
Suresh babu Thandullu Naganathan (Eswari Engineering College, Chennai, Tamilnadu & Adhiparasakthi Engineering College, Melmaruvathur, Tamilnadu, India); Sivakumar Dhandapani (Eswari Engineering College, Chennai, Tamilnadu, India)
In this paper, a novel design is proposed for a wide-band microstrip rectangular patch antenna on a solar-cell-integrated Artificial Magnetic Conductor (AMC) surface for wireless communication. The general reflection characteristic of an AMC surface with rectangular unit cells acting as a reflector, and the effect of integrating the solar cell with the proposed antenna, are validated. The designed AMC plane consists of 6×6 patches with either a copper ground plane or a solar cell ground plane, placed beneath the rectangular microstrip patch antenna with an air gap between the structures; this achieves unidirectional radiation and simultaneously enhances the operating bandwidth. The amorphous silicon solar cell in the module serves both as a photovoltaic generator and, from an RF perspective, as the ground plane of the AMC plane. The effectiveness of the proposed antenna structure is confirmed through 3D full-wave electromagnetic simulation in the high-frequency structure simulator (HFSS) tool and through measurement. The fabricated patch antenna on the AMC plane with a copper ground plane covers the 2.12-2.48 GHz band, while the same radiating patch on the AMC plane with a solar cell ground plane covers 2.16-3.32 GHz with enhanced impedance bandwidth, good return loss, good VSWR, and good radiation pattern characteristics for green WLAN applications.
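For context, the standard transmission-line design equations give first-cut patch dimensions before HFSS optimization (textbook formulas, not the paper's optimized dimensions; the FR-4 substrate parameters below are assumptions):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def patch_dimensions(f0, er, h):
    """Textbook transmission-line estimates for a rectangular
    microstrip patch: width W, effective permittivity, length L."""
    W = C / (2 * f0) * math.sqrt(2 / (er + 1))
    eeff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / W) ** -0.5
    # fringing-field length extension (Hammerstad)
    dL = 0.412 * h * ((eeff + 0.3) * (W / h + 0.264)) / \
         ((eeff - 0.258) * (W / h + 0.8))
    L = C / (2 * f0 * math.sqrt(eeff)) - 2 * dL
    return W, eeff, L

# Assumed example: 2.4 GHz WLAN patch on 1.6 mm FR-4 (er = 4.4)
W, eeff, L = patch_dimensions(2.4e9, 4.4, 1.6e-3)
```

These closed-form estimates (roughly a 38 mm x 29 mm patch for the assumed substrate) are typically the starting point that full-wave tools such as HFSS then refine.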
S2B: Medical Decision Making 2
11:10 A comparative study of different feature extraction techniques for identifying COVID-19 patients using chest X-Rays images
Shrasti Vyas (PDPM Indian Institute of Information Technology, Design and Manufacturing, Jabalpur, India); Ayan Seal (PDPM Indian Institute of Information Technology, Design and Manufacturing Jabalpur, India)
The coronavirus (COVID-19) outbreak has been labeled a pandemic, with no assured vaccine or drug so far. Many medical trials seeking a treatment for this disease are under way, and some have achieved success, but reaching all stakeholders is strenuous. Quick and proper identification of a COVID-19 patient through testing is equally important to prevent the spread of the virus to healthy people. Thus, this work presents a comparative study of different feature extraction techniques for identifying COVID-19 patients from chest X-ray images. The combination of the local binary pattern feature extraction technique and a gradient boosting classifier performs best, with 94.453% accuracy, compared to the other approaches. This work can therefore be of great help in the early screening of COVID-19 and contribute to the healthcare system's fight against coronavirus.
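The local binary pattern step can be sketched as a basic 3×3 LBP followed by a histogram feature vector (a naive illustration; the paper's exact LBP variant and classifier settings are not reproduced):

```python
import numpy as np

def lbp(img):
    """Basic 3x3 local binary patterns: each interior pixel becomes
    an 8-bit code from thresholding its neighbours against the centre."""
    out = np.zeros((img.shape[0] - 2, img.shape[1] - 2), dtype=np.uint8)
    # neighbour offsets, clockwise from top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            c = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offs):
                if img[i + di, j + dj] >= c:
                    code |= 1 << bit
            out[i - 1, j - 1] = code
    return out

img = np.arange(25, dtype=np.uint8).reshape(5, 5)  # toy "X-ray" patch
hist = np.bincount(lbp(img).ravel(), minlength=256)  # feature vector
```

The 256-bin histogram is the texture descriptor that would be fed to the gradient boosting classifier.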
11:30 A Multi-criteria Scheme to Build Model Ensemble for Dengue Infection Case Estimation
The epidemic of dengue fever has long been a major health threat to populations in many tropical and subtropical countries, ranging from Asia and Africa to the Americas. Various statistical and computational intelligence methods have been adopted by researchers worldwide to build accurate models for predicting the incidence of dengue infection, as early and precise prediction is one of the key factors in effective control of dengue spread. Most current modeling techniques apply a single model to predict dengue spread. We propose a different strategy: prediction by voting within a group of powerful models should yield more accurate results than traditional prediction by a single model. However, the appropriate number and type of models to include in an ensemble are open questions. We introduce a heuristic method to select promising models based on scores computed from a multi-criteria scheme. In particular, our model ranking method considers three main criteria: the prediction error made by the model, the correlation between the target and the model's predictors, and the size of the model. To avoid overfitting when applying the ensemble to future unseen data, we select models from both the top and the bottom of the ranking. Experimentation on out-of-sample data confirms the prediction accuracy of our ensemble scheme.
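The multi-criteria ranking and top-plus-bottom selection can be sketched as follows (the score weighting and the candidate models' numbers are invented, not the paper's):

```python
# Candidate models: (name, prediction error, target-predictor
# correlation, number of predictors) -- all values are made up.
models = [("m1", 0.20, 0.85, 4), ("m2", 0.25, 0.90, 6),
          ("m3", 0.15, 0.70, 3), ("m4", 0.30, 0.95, 8),
          ("m5", 0.22, 0.80, 5), ("m6", 0.28, 0.60, 2)]

def score(m):
    _, err, corr, size = m
    # lower error, higher correlation, smaller model -> higher score
    return -err + corr - 0.01 * size

ranked = sorted(models, key=score, reverse=True)
# pick from both ends of the ranking to diversify the ensemble
ensemble = ranked[:2] + ranked[-2:]
```

Each selected model would then vote (e.g. by averaging its case forecast), with the bottom-ranked members included to hedge against overfitting to the in-sample ranking.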
11:50 A review on Diagnosis and Treatment methods for coronavirus disease with sensors
Ivan Miguel Pires (Instituto de Telecomunicações, Universidade da Beira Interior, Portugal)
Due to the spread of the coronavirus pandemic, many scientific studies have been performed to stop the evolution of this virus around the world. It is a new virus, and no treatment has yet been developed. Coronaviruses are a large family of viruses that cause illness ranging from the common cold to more severe diseases, such as Middle East Respiratory Syndrome (MERS-CoV) and Severe Acute Respiratory Syndrome (SARS-CoV); the most common problems are related to SARS-CoV. This paper studies the evolution of research using different types of sensors for the treatment and diagnosis of this virus. It was verified that the subject is widely researched, but only a small set of studies use sensors and technological equipment.
12:10 Improving Efficiency of Self-care Classification Using PCA and Decision Tree Algorithm
Muhammad Syafrudin, Ganjar Alfian and Norma Latif Latif Fitriyani (Dongguk University, Korea (South)); Abdul Hafidh Sidiq (Gunadarma University, Indonesia); Tjahjanto Tjahjanto (Universitas Budi Luhur, Indonesia); Jongtae Rhee (Dongguk University, Korea (South))
Self-care classification for children with physical disabilities remains an important and challenging issue that normally requires the support of occupational therapists. Data-driven decision making has been widely adopted to make decisions based on data with the help of expert systems and machine learning algorithms. In this study, we developed an efficient self-care classification model based on principal component analysis (PCA) and a decision tree (DT): PCA is used to extract the significant features, while the DT is used to build the classification model. We measured several metrics to evaluate the performance of the proposed model against other models and previous study results. Based on 10-fold cross-validation, the proposed model outperformed the other models and previous results, achieving an accuracy of 94.29%. Furthermore, PCA-based feature extraction improved the model's performance, with an average accuracy gain of 1.7% compared to classifiers without it. It is expected that the outcomes of the study can assist occupational therapists by improving the efficiency of self-care classification and children's therapy.
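The PCA step can be sketched via the SVD of the centred data; the reduced features would then feed the decision tree (a generic illustration on synthetic data, not the study's dataset):

```python
import numpy as np

def pca_features(X, k):
    """Project rows of X onto the top-k principal components; the
    reduced features are what a downstream decision tree would use."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()   # variance ratio per component
    return Xc @ Vt[:k].T, explained[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
X[:, 1] = 2 * X[:, 0]        # a redundant column PCA can compress away
Z, ev = pca_features(X, k=3)
```

Dropping low-variance components in this way is what gives the reported efficiency gain: the tree is trained on fewer, more informative features.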
S2C: Computerized Decision Aid 2
11:10 Attendance Management System using Facial Recognition
Ankush Kumar (Indian Institute of Information Technology Bhagalpur, India); Manish Sharma (Indian Institute Of Information Technology Bhagalpur, India); Saurabh Pratap Gautam, Ranjeet Kumar and Sandeep Raj (Indian Institute of Information Technology Bhagalpur, India)
In recent years, facial recognition technologies have undergone remarkable advances in the domains of commerce and security. This paper presents an automated real-time attendance management system (AMS) using a face recognition technique to reduce human dependency and thereby save time. A modified local binary pattern histogram (MLBPH) algorithm, based on the gray median of the pixel neighborhood, is used for extracting the significant features of the human face. More specifically, facial landmarks are extracted to provide a unique result using the MLBPH histogram. The histogram of the input image is then compared with the database histograms by the classifier in the classification step, and a face matched with the database is used to mark attendance in the laboratory. The experiments reported a precision of 97% and a recall of 95%. Such a biometric system, a real-time attendance system that processes human faces with simple, fast, and accurate algorithms, can be deployed in schools.
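Histogram comparison in the classification step is commonly done with a chi-square distance (a generic sketch; the identities and toy histograms below are invented, and the paper's MLBPH features are not reproduced):

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized histograms;
    smaller means a closer match."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def match(query, database):
    """Return the enrolled identity whose histogram is nearest."""
    return min(database,
               key=lambda name: chi2_distance(query, database[name]))

db = {"alice": np.array([0.7, 0.2, 0.1]),
      "bob":   np.array([0.1, 0.3, 0.6])}
who = match(np.array([0.65, 0.25, 0.10]), db)
```

In an attendance system, the matched identity would then be logged with a timestamp; a distance threshold would normally be added so that unknown faces are rejected rather than matched to the nearest entry.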
11:30 Prediction of Success in Crowdfunding Platforms
Membership platforms serve as a source of constant earnings for independent creators who produce media content, such as images, videos, and podcasts, patronized by their followers. This mechanism builds a platform where patrons contribute funds to support the creator. Patreon is one of the largest membership-based platforms crowdfunding media content projects. Predicting the success of crowdfunding projects is equally important for project creators and investors. In this research, we resort to supervised machine learning techniques to provide decision-making support for predicting the success or failure of such projects. Comparing Naïve Bayes, logistic regression, and Random Forest classifiers, we demonstrate that the Random Forest classifier, with an accuracy of 71.5%, outperforms the other two in success prediction. The findings will help creators make better decisions on their projects and grow their fan/follower base using different social media platforms.
11:50 Building A New Blueprint for Operating Workflow Efficiently
Business processes indicate how to handle and deal with different business situations. Companies focus on the flexibility of their information technology architecture as a main business strategy for decision making, so their systems should satisfy adaptability criteria. In this paper, we propose a framework to measure the quality of an existing workflow using quality-of-service criteria. Based on the results of these measurements, the workflow is enhanced using concepts of graph theory, such as min cut and max cut algorithms, and some refactoring techniques. To reduce the bugs that could occur and to extend the life span of an organization's workflow, we propose, design, and implement a new framework that takes a workflow as input and checks whether it meets specific quality-of-service measurements; if the workflow is not qualified, it is enhanced automatically using our proposed algorithms to meet the quality-of-service rules and is checked for consistency within the system. The framework's main components are quality measurement, determination, parsing, and code adaptation. We present and discuss a real case study to help in understanding, illustrating, and analyzing the proposed framework's behavior and the applicability of its approach, with analysis of different related scenarios.
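A minimum cut of a workflow graph can be obtained from a maximum flow (by the max-flow/min-cut theorem); a plain Edmonds-Karp sketch on an invented workflow graph:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on an adjacency-matrix graph; its value
    equals the capacity of a minimum s-t cut (max-flow/min-cut)."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total            # no augmenting path left
        # find the bottleneck along the path, then augment
        v, aug = t, float("inf")
        while v != s:
            u = parent[v]
            aug = min(aug, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += aug
            flow[v][u] -= aug
            v = u
        total += aug

# Toy workflow graph: node 0 = start, node 3 = end (capacities made up)
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
```

The min-cut edges identified this way would mark the workflow's tightest bottleneck, a natural target for the refactoring step.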
12:10 Social interactions issues in group decision-making
Ayeley P. Tchangani (University Toulouse III, France)
Decision-making is certainly the most widespread of all human activities, whether individual or collective. Some decisions, especially individual ones, are easy to make and do not require sophisticated algorithms to arrive at a solution. Others, especially in group decision-making, require frameworks, rules, or algorithms of varying degrees of sophistication to arrive at a satisfactory solution. In this process, the most difficult part is certainly the modeling and treatment of the relationships between the actors in the decision-making group. The objective of this paper is therefore to build a framework for modeling and analyzing interactions between decision-makers in a group on a computational basis, in the sense that these interactions are characterized by numerical parameters. The constant concern in this work is to get as close as possible to human behavior by using bipolar analysis.
S2D: Decision Aid in Accounting and Auditing 2
11:10 Decision Aid in Budgeting Systems for Small & Medium Enterprises
The practice of budgeting in Small & Medium Enterprises (SMEs) was investigated in this exploratory study. Two research methods, a survey and interviews, were employed to answer the following questions: What methods and types of budgeting do SMEs use? How is budgeting carried out in SMEs? The question of why SMEs adopt budgeting further taps the underlying reasons for the lack of a comprehensive budgeting system or for no budget adoption at all. The study presents four important findings. Incremental budgeting was identified as the common method and the cash budget as the primary type of budget. The budgeting process was generally based on a traditional command-and-control management model, in which the budget was centralised and relied on bureaucratic control with a rigid hierarchy for the allocation of funds and resources. Goal-setting, resource-dependency, and cognitive-evaluation theoretical perspectives play a greater role in explaining budget practice and the lack of budget adoption in SMEs. The study points SMEs and relevant agencies toward giving more attention to training and skills development, technological support, and the attitudinal and behavioural aspects of budgeting, so that budgeting is actively used in doing business.
11:30 Can Related Party Transactions Be a Matter For Firm Value? Evidence From Emerging Markets
The aim of this study is to investigate the effect of related party transactions on the market value of the firm, using a sample of listed firms from six emerging markets: Bahrain, the Kingdom of Saudi Arabia, Pakistan, Kuwait, Jordan, and the United Arab Emirates. Our final sample consists of 261 firms with 1,044 firm-year observations covering the period 2015-2018. The study classifies related party transactions into five types: key management compensation, related party receivables (due from), related party payables (due to), related party sales, and related party purchases. The results show that two of these types, key management compensation and payables (due to), have a significant effect on the firm's market value: the first has a significant negative effect, while the second has a significant positive effect. In contrast, the other three types (purchases, sales, and receivables) are not significantly associated with the firm's market value.
11:50 A GCC Evidence on the Effect of Board and Audit Committee Features on CSRD: the Case of Employee and Product Information
The key objective of this study is to examine the possible association between board and audit committee (AC) features and the extent of disclosure of employee and product information (EPID) by listed firms in three Gulf Cooperation Council (GCC) countries: Bahrain, Kuwait, and the UAE. An EPID index containing 19 items was applied to collect suitable information from the sampled firms' annual reports and websites. A total of 255 firm-year observations across the three countries was used over a 3-year period (2017-2019). The study employed six independent variables representing board and AC features (board independence, CEO duality, board size, AC independence, AC number of meetings, and AC size), statistically controlling for the effects of firm type, firm size, and external auditor quality. The regression findings reveal that the independent variables used in this research do not explain the disclosure of employee information; however, they significantly explain the disclosure of product information. Further, AC size is the most essential variable determining EPID.
12:10 Corporate Governance, Ownership Structure and Investment Structure: Evidence from GCC Countries
The purpose of this study is to examine the impact of corporate governance mechanisms and ownership structure on the investment structure of Gulf Cooperation Council (GCC) non-financial companies. The empirical results show that the concentration of shareholding, foreign shareholding, and governmental shareholding are significantly positively associated with the firm's investment structure. This suggests that the higher the non-local ownership in GCC companies, the higher the fixed asset investments, and the higher the percentage of governmental ownership, the higher the investments in assets. Additionally, the results indicate that board independence, the presence of female directors, and the frequency of board meetings are significantly associated with the firm's investment structure. The results suggest that higher board independence would enhance the capability of the firm to increase its rate of investment in assets. Also, the results show that audit committee independence and the frequency of its meetings are significantly positively associated with investment structure.
S2E: Decision Aid in Logistics and Engineering 3
11:10 A fuzzy multicriteria approach for phosphogypsum waste valorization
Phosphogypsum (PG) is the waste by-product of phosphoric acid production. Global PG production is estimated at around 160 Mt/year (IAEA, 2013). This enormous quantity poses a serious disposal dilemma and high management costs. The environmental issue has aroused interest in the scientific community, as evidenced by the increasing number of publications on PG in recent years. Our work describes the Tunisian PG by-product, its storage method, and the different recovery techniques studied around the world. We develop a new approach to help groups of decision-makers deal with complex PG valorization problems. The approach uses Grey Relational Analysis (GRA) to combine the decision matrices of individual decision-makers into a group decision, the fuzzy Analytic Hierarchy Process (Fuzzy AHP) to derive the fuzzy criteria weights, and the fuzzy Technique for Order of Preference by Similarity to Ideal Solution (Fuzzy TOPSIS) to rank the studied alternatives. The results were accepted by the expert group, who hope to see the approach applied in practice.
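As an illustration of the final ranking step, here is a minimal crisp (non-fuzzy) TOPSIS sketch; the alternatives, weights, and benefit/cost flags are invented, and the paper itself uses the fuzzy AHP and fuzzy TOPSIS variants rather than this simplified crisp form.

```python
# Simplified crisp TOPSIS; the paper uses the fuzzy variants, and the
# alternatives/weights below are invented for illustration.
import math

def topsis(matrix, weights, benefit):
    # Vector-normalize each criterion column.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(len(weights))]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)]
         for row in matrix]
    # Ideal and anti-ideal points per criterion (max for benefit, min for cost).
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # Closeness coefficient: nearer the ideal, farther from the anti-ideal.
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]

# Three hypothetical PG valorization routes scored on three criteria.
alternatives = [[7, 3, 8], [5, 5, 6], [9, 2, 4]]
weights = [0.5, 0.2, 0.3]        # e.g., weights obtained from AHP
benefit = [True, False, True]    # the second criterion is a cost
print(topsis(alternatives, weights, benefit))
```

The fuzzy versions replace each crisp score with a triangular fuzzy number and each distance with a fuzzy distance, but the overall flow (normalize, weight, compare with ideal points, rank) is the same.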
11:30 Fuzzy decision support system for risk analysis in urban requalification project
Urban requalification projects in crowded areas are a challenge for urbanists. The planned benefits are only expected to become noticeable in the mid-term, after the interventions are complete, while the construction itself is a nuisance for most of the population, who must live alongside it. In this paper, we propose a decision support system to evaluate the risk that an urban requalification project fails to reach its objectives and does not improve the citizens' well-being. To do so, we use a model that combines well-known methods from the literature, the Copeland method and the Analytic Hierarchy Process, with picture fuzzy sets. The paper also presents an evaluation of a real case, the Conde da Boa Vista Avenue project, where the model is demonstrated.
11:50 On Solving the On-Demand Grocery Delivery Problem
Nour El Houda Hammami (National Engineering School of Tunis, University Tunis El Manar, Tunisia); Safa Bhar Layeb (National Engineering School of Tunis, University of Tunis El Manar, Tunis & UR-OASIS : Optimization and Analysis of Service and Industrial Systems Laboratory, Tunisia); Amel Jaoua (LR-OASIS, National Engineering School of Tunis, University of Tunis El Manar, Tunisia)
With the rise of digitalization, customer expectations, and urbanization rates in cities, on-demand commerce is proving very promising because of the flexibility offered to consumers. This flexibility raises many challenges regarding the delivery times requested by customers. In this work, we propose a new MILP model for the on-demand grocery delivery problem that assigns delivery time windows, chosen from the sets of time windows specified by customers when they order their groceries, while minimizing the total routing costs. The proposed model was first assessed on realistic data from a hypermarket in the northern suburbs of Tunis, Tunisia. The promising results suggest directions for future work.
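The core assignment decision can be illustrated without the full MILP: choose one of each customer's requested windows so that deliveries sharing a window can be pooled. This brute-force sketch is only a stand-in for the paper's model; the `requests` data, the fixed `trip_cost`, and the one-trip-per-window cost rule are all invented, and real routing costs are far more involved.

```python
# Brute-force sketch of the window-assignment decision; the paper's MILP
# also handles vehicle routing, which is omitted here. All data invented.
from itertools import product

# Each customer submits a set of acceptable one-hour windows (start hour).
requests = {"c1": [9, 10], "c2": [10, 11], "c3": [9, 11]}
trip_cost = 5.0   # assumed fixed cost per distinct delivery window served

def plan(requests, trip_cost):
    best = None
    names = list(requests)
    for choice in product(*(requests[n] for n in names)):
        cost = trip_cost * len(set(choice))  # one trip per window in use
        if best is None or cost < best[0]:
            best = (cost, dict(zip(names, choice)))
    return best

cost, assignment = plan(requests, trip_cost)
print(cost, assignment)
```

An exact MILP replaces this enumeration with binary assignment variables and routing constraints, which is what makes realistic instance sizes tractable.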
12:10 Logistic and Blockchain: A Strong Partnership
Economic growth is an important goal for governments. This growth is driven by economic activities known to be major emitters of greenhouse gases (GHG), resulting in climate change, and most of those activities are related to the logistics and supply chain sectors. Countries around the world have implemented carbon emissions trading (CET) schemes to monitor, control, and reduce GHG, especially CO2, by trading carbon emission allowances. However, CET has suffered from shortcomings that degrade its effectiveness, and it has not been comprehensive enough to cover all sources of CO2 emissions. Many researchers have proposed solutions that use blockchain technology to elevate the present CET scheme; blockchain presents itself as an ideal fit for the current CET market because of its transparency, immutability, security, and decentralization. However, none of the previous works was adequate or comprehensive enough. This research proposes a comprehensive blockchain-based framework to support CET that integrates IoT devices and includes the logistics and supply chain sectors. Our framework collects full data about CO2 emissions directly from all sources and shares them securely and immutably among stakeholders. It supports decisions on economic processes that reduce CO2 emissions as well as on trading allowances.
S2F: Management Decision 2
11:10 Data Analytics of Strategic Agility and Competitiveness in Operation performance: A case of Banking Sector in Saudi Arabia
Developing capabilities of agility, prompt responsiveness, and adaptation to environmental changes is crucial to gaining a competitive advantage. Strategic agility is a development of the agility concept that pays attention to strategic matters and to predicting changes in the environment before they occur. Moreover, business analytics has brought compelling value to businesses, especially the banking industry, as it is a sharp instrument for reflection, prediction, and decision-making. The purpose of this study is to explain and measure the impact of strategic agility, supported by data analytics, and competitive advantage on the operational performance of the retail-banking sector in Saudi Arabia. The research design combines descriptive and inferential statistics, and the authors integrate the management perspective with business analytics for greater efficiency and business integration. The statistical population of this study consists of managers and experts working in the retail-banking sector in Saudi Arabia; a total of 140 managers and experts from the banks formed the study sample, and the Pearson correlation test was applied to the relationships between the variables. The results indicate that strategic agility has a significant impact on the competitive capabilities of the banking sector. Furthermore, among the variables of strategic agility, price flexibility is the most influential factor in competitive capability.
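The study's main statistic, the Pearson correlation coefficient, can be computed directly from its definition; the `agility` and `performance` score vectors below are invented for illustration.

```python
# Pearson correlation from its definition: covariance divided by the
# product of standard deviations. The sample Likert means are invented.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

agility = [3.2, 4.1, 2.8, 4.5, 3.9]       # hypothetical Likert-scale means
performance = [3.0, 4.3, 2.9, 4.4, 3.7]   # hypothetical performance scores
print(round(pearson(agility, performance), 3))   # close to 1: strong link
```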
11:30 Comprehensive Forecasting of Interconnected Socio-Economic Indicators as a Methodological Basis for Adopting Optimal Management
Oleksandr Yankovyi (Odesa State Economic University, Ukraine); Oleksii Hutsaliuk (International European University, Ukraine); Viktoriia Tomareva-Patlakhova (Classic Private University, Ukraine); Nataliia Shmatko (Kharkiv Polytechnic Institute, Ukraine); Olena Kabanova and Yuliia Rud (Classic Private University, Ukraine)
The article builds on theoretical achievements of technical cybernetics, in particular the principle of balancing variables. Its essence is that the final conclusion on the suitability of particular forecasts is determined by the degree to which the predicted values of interrelated variables satisfy the balance relation that held among them in the historical period. The principle of balance of variables, and the criterion based on it, plays the role of an external complement that carries new information about the studied process and can take priority over the usual methods of choosing mathematical forms of trends of varying complexity. First of all, this applies to the criterion of maximizing the coefficient of determination, which is usually used to select the best reference functions approximating each time series in isolation. It is proposed to use this principle, and the resulting balance-of-variables criterion, when forecasting socio-economic indicators linked by additive and multiplicative relations. The application of the criterion is illustrated by forecasting the number of births, deaths, and natural population growth in Ukraine for 2020-2021. Despite the small sample, balanced results were obtained for short-term forecasting of Ukraine's demographic indicators. They can serve as a reliable methodological basis for planning measures to improve the demographic situation in the country, as well as for making optimal management decisions on the use of human resources in the economy.
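Under the additive link natural increase = births - deaths, the balance check can be sketched as follows; the demographic figures are invented. Note that when all three series are fitted with the same linear trend form, the balance holds exactly (ordinary least squares is linear in the data), so the criterion becomes informative precisely when candidate trend forms of different complexity are compared.

```python
# Balance-of-variables sketch under the additive link
# natural increase = births - deaths. All figures are invented.
def linear_trend(series):
    # Ordinary least-squares line y = a + b*t over t = 0..n-1.
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return lambda t: a + b * t

births = [440, 430, 425, 410, 400]   # hypothetical, thousands per year
deaths = [590, 600, 595, 610, 620]
growth = [b - d for b, d in zip(births, deaths)]

# Forecast one step ahead with each series' own trend, then check balance.
t_next = len(births)
b_hat, d_hat, g_hat = (linear_trend(s)(t_next) for s in (births, deaths, growth))
imbalance = abs(b_hat - d_hat - g_hat)   # zero means perfectly balanced
print(b_hat, d_hat, g_hat, imbalance)
```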
11:50 Optimization Model for Production Systems of Irrigation Improvement Projects Using Nonlinear Programming and Genetic Algorithms
Robert Antonio Romero-Flores (National University of the Altiplano, Peru)
This work formulates an optimization model for agricultural production systems under irrigation conditions in the Cusco region of Peru, based on an evaluation of various analytical and heuristic methods. Nonlinear programming emerged as the most appropriate method because it accommodates an objective function together with the constraints that arise from the system's interaction with the market under real conditions. One of the most relevant variables in this interaction is the sale price, for which a new mathematical model is proposed that accounts for fluctuations in supply and demand. The main contribution of the proposed optimization model is that it reveals the conditions required to maximize profit and minimize production costs, so that losses affecting the economy of the agricultural producer can be anticipated. Likewise, the model allows the construction of scenarios, by modifying variables and constraints, to study the economic behavior of various products. The results of the optimization model are consistent and satisfy the Karush-Kuhn-Tucker conditions, and are therefore considered validated. Within this framework, the model was also evaluated using genetic algorithms, with the sexual selection operator showing the best performance for finding the optimum in constrained nonlinear problems.
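A toy version of the heuristic check can be sketched as a genetic algorithm maximizing a concave profit function under a resource constraint handled by a penalty; the profit function, the constraint, and the GA settings are invented, and simple truncation selection stands in for the sexual selection operator studied in the paper.

```python
# Toy GA for a constrained nonlinear profit problem. The profit function,
# the resource limit, and the GA settings are invented; truncation
# selection stands in for the paper's sexual selection operator.
import random

random.seed(0)

def profit(x):
    return 12 * x - x ** 2        # concave profit, maximized at x = 6

def fitness(x):
    # Penalty method: heavily punish violating the resource limit x <= 8.
    return profit(x) - 1000 * max(0.0, x - 8)

pop = [random.uniform(0, 10) for _ in range(30)]
for _ in range(100):
    parents = sorted(pop, key=fitness, reverse=True)[:10]   # keep the fittest
    pop = [min(10.0, max(0.0, random.choice(parents) + random.gauss(0, 0.3)))
           for _ in range(30)]                              # mutate offspring
best = max(pop, key=fitness)
print(round(best, 2), round(profit(best), 2))   # best is near x = 6
```

At the interior optimum the KKT conditions reduce to a zero gradient (profit'(6) = 0) with the resource constraint inactive, which is the kind of consistency check the paper applies to validate its results.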
12:10 Supporting Tools for Transition towards Industry 4.0: A Pressurized Cylinder Manufacturing Case Study
Md Mashum Billal, Miguel Baritto, S. M. Muntasir Nasim and Rumana Afroz Sultana (University of Alberta, Canada); Mohammad Arani (University of Arkansas at Little Rock, USA); Ahmed Qureshi (University of Alberta, Canada)
The main purpose of this work is to report a novel methodology, implemented as a case study in a pressurized cylinder manufacturing company, that supports Small and Medium Enterprise (SME) managers in better understanding the specific requirements for implementing Industry 4.0 solutions and the benefits they bring to their firms. Cylinder loss, inadequate scrap management, and a bottleneck in body welding were identified as three of the main problems that could be addressed through Industry 4.0. Potential solutions were considered and suitable ones identified; for example, a Radio-Frequency Identification (RFID) tool was proposed to prevent illegal cross-filling and illegal cylinder swapping and to help with cylinder identification. A rough techno-economic feasibility analysis was also carried out for the proposed solutions, which will help SME managers decide when and how to migrate to Industry 4.0.
S2G: Decision Making Using IoT and ML 2
11:10 Microsoft Azure IoT-based Edge Computing for Smart Homes
In this work, an experimental Edge Computing (EC) system based on Microsoft Azure IoT has been developed for a Smart Home Environment (SHE) as a case study. Several relevant issues have been tested, such as device management patterns and some security concerns. Getting-started issues of this technology are emphasized, including development-machine compatibility, edge-device selection, the container engine, and a suitable IDE for development. An emulated smart home scenario with five virtual sensor devices hosted by a Raspberry Pi edge device has been used to investigate the properties of the system. We have studied the results of applying the newly developed configuration management patterns to set or modify environment variables remotely and inside the edge node. Furthermore, various operational cases have been studied to experimentally verify the difference between enabling edge analytics and sending raw readings to the cloud. Finally, the use of symmetric encryption to strengthen the security of the application has been studied and experimentally verified.
11:30 InteroEvery: Microservice Based Interoperable System
Badr El Khalyly (Hassan II University, Morocco); Allae Erraissi (Hassan II University & Faculty of Sciences, Morocco); Mouad Banane (University Hassan II, Morocco); Abdessamad Belangour (Hassan II University, Morocco)
The advent of connected objects has caused a significant technological revolution. Researchers, designers, and developers have mobilized to build solutions that can reach end users in several domains, solutions that make the interaction between users and objects easier and handier. Many efforts have therefore been made to facilitate device control, and heavy synergy has been invested in connecting objects as part of the machine-to-machine pattern. Connecting objects must take into account the heterogeneity of communication protocols: brands and companies produce devices that use different protocols from the physical to the application layer, and each brand adopts protocols in its products that may differ from other brands' products. This raises the challenge of interoperability. Applications and solutions that monitor objects follow many architectural styles; nowadays, the trend is microservices. Deployment of these applications can be performed on three levels, edge, fog, and cloud, which pushes integrators to create and innovate deployment and integration strategies. This paper fits these paradigms together to create a solution that meets end users' need for interoperability.
11:50 Stacking-based GRNN-SGTM Ensemble Model for Prediction Tasks
Ivan Izonin, Roman Tkachenko, Pavlo Vitynskyi and Khrystyna Zub (Lviv Polytechnic National University, Ukraine); Pavlo Tkachenko (IT STEP University, Ukraine); Ivanna M Dronyuk (Lviv Polytechnic National University, Ukraine)
Effective solution of prediction tasks requires high accuracy of the result with minimal resource and time costs for the chosen algorithm. When high accuracy of the result is the first priority, it is expedient to use ensemble learning. This paper describes a prediction method using a new stacking-based GRNN ensemble model. Each member of the developed ensemble processes its own dataset, in which the vectors of the original dataset are randomly shifted relative to the current point. The authors chose the SGTM neural-like structure as the meta-algorithm that forms the ensemble result, a choice justified by its high accuracy and speed. The results of a number of experimental studies on selecting the optimal parameters of the developed ensemble are described, and its efficiency is compared with a number of known predictors. Prospects for further research are outlined.
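The stacking idea can be sketched with GRNN-style (Nadaraya-Watson) kernel regressors, each trained on a randomly shifted copy of the inputs; a plain average stands in for the paper's SGTM meta-algorithm, and the dataset, shift range, and kernel width are invented.

```python
# Stacking sketch with GRNN-style (Nadaraya-Watson) members; a plain
# average replaces the paper's SGTM meta-algorithm. Data are synthetic.
import math
import random

random.seed(1)

def grnn(train_x, train_y, sigma=0.2):
    # Gaussian-kernel regression: prediction is a weighted mean of train_y.
    def predict(x):
        w = [math.exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
        return sum(wi * yi for wi, yi in zip(w, train_y)) / sum(w)
    return predict

xs = [i / 10 for i in range(50)]
ys = [math.sin(x) for x in xs]

# Each ensemble member sees the inputs shifted by a small random offset.
members = []
for _ in range(5):
    shift = random.uniform(-0.05, 0.05)
    members.append(grnn([x + shift for x in xs], ys))

def ensemble(x):
    return sum(m(x) for m in members) / len(members)

print(round(ensemble(2.0), 3))   # close to sin(2.0)
```

A trainable meta-learner such as SGTM would replace the plain average, weighting the members according to their validation performance.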
12:10 Traffic-Lights-Based Guidance System in Lebanon using Network Optimization
Elie Khoderchah (University of Balamand, Lebanon); Ibrahim Jazzar (Lebanese University, Lebanon); Samer El-Zahab (University of Balamand, Lebanon); Nabil Semaan (University Of Balamand, Canada); Abobakr Al-Sakkaf (Concordia University, Canada)
Every country in the world faces major civil problems. In Lebanon, one of the main problems the small country faces is road and traffic congestion. First, the lack of infrastructure is a major concern that directly impacts the routes people use daily. Second, the absence of 24-hour electricity is another factor directly impacting the safety and state of the main roads. Third, the country lacks proper funding to restore or improve public transportation, which could otherwise decrease overall car usage. Finally, 80% of the roads today lack satellite coordinate coding, meaning that GPS services such as Google Maps are not always accurate or reliable. All in all, the transportation sector has many problems to solve; this project addresses one of the main ones, namely poor street lighting and the lack of a guidance system. The main motivation is the alarming number of car crashes on dark and unsafe roads, which cost the lives of many Lebanese citizens. Dark roads also pose an indirect threat to drivers at night by forcing cars on opposite lanes of the highway to keep their high-beam headlights on, which can blur the opposing driver's vision and, in extreme cases, cause a crash. This project provides an automatic street lighting system that charges during the day and lights the streets at night whenever sensors detect the presence of a car, and that also guides cars on the many roads where traditional GPS tracking systems fail.
S2H: Sustainable Decisions for a Sustainable Development 2
11:10 A State-of-the-Art Review on xEVs and Charging Infrastructure
Mohd Rizwan Khalid (ALIGARH, India)
The aim of this paper is to introduce the emerging technology of x-electric vehicles (xEVs), where x is a general term standing for hybrid, plug-in, battery, etc. xEVs have emerged as an alternative to internal combustion (IC) engine based vehicles because of their lower dependence on fossil fuels; they cause less environmental pollution (mainly CO and CO2) and hence play an important role in mitigating global warming. Moreover, xEVs are far more efficient than IC engine based vehicles, so it is a wise option to switch from IC engine based vehicles to xEVs. In this paper, xEV technology is discussed in detail, including charging infrastructures, charging power levels, and energy storage systems. Moreover, the hurdles on the way to the success of this technology are discussed, and the most efficient solutions, which are easy to implement and do not need large investments, are suggested.
11:30 Decision making to calculate economic sustainability index: A case study
This work assesses the economic sustainability of industrial factories in Egypt based on the Analytic Hierarchy Process (AHP). The authors designed an economic sustainability index calculator (ESIC) using the Unified Modelling Language (UML) and implemented the software in Visual Basic for Applications (VBA). UML was used for system analysis and design only, giving insight into the system architecture. The manual method of calculating the economic sustainability index is included in this research and verified using the designed ESIC. The categories of key performance indicators (KPIs) covered are production-based, plant-layout-based, and quality-based.
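The AHP computation at the heart of a calculator like ESIC is compact: derive criterion weights from a pairwise comparison matrix via the geometric mean, then check the consistency ratio. The comparison values below are illustrative only; the paper implements its calculator in VBA.

```python
# Geometric-mean AHP weights and consistency check, the computation at
# the core of an index calculator like ESIC. Comparison values invented.
import math

# Pairwise comparisons of three KPI categories
# (production-based, plant-layout-based, quality-based).
A = [[1,     3,     5],
     [1 / 3, 1,     3],
     [1 / 5, 1 / 3, 1]]

n = len(A)
geo = [math.prod(row) ** (1 / n) for row in A]       # row geometric means
weights = [g / sum(geo) for g in geo]                # normalized priorities

# Consistency ratio via an estimate of lambda_max (RI = 0.58 for n = 3).
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / weights[i] for i in range(n)) / n
CR = (lam - n) / (n - 1) / 0.58
print([round(w, 3) for w in weights], round(CR, 3))  # CR < 0.1 is acceptable
```

The resulting weights multiply the normalized KPI scores to yield the composite sustainability index.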
11:50 Air Quality Monitoring through LoRa Technologies: A Literature Review
The impact of air quality on human beings is significant: millions of deaths and health problems are associated with air pollution. Several standards and policies to regulate and improve air quality have been established by both the World Health Organization and the Environmental Protection Agency. Recently, several authors have proposed cost-effective devices for measuring air quality in real time, leading to a proliferation of low-cost options for measuring air quality and acquiring sensor data. This also makes it possible to develop novel techniques and use artificial intelligence to predict poor air quality scenarios. This paper presents a literature review on IoT architectures for air quality monitoring using LoRa communication technology. The objective of the study is to support future research initiatives on air quality monitoring systems using LoRa, and its main contribution is an overview of the sensors, power sources, communication, data storage, processing, and visualization technologies used in the methods available in the literature.
12:10 Simulation Study of a Solar Glider Design
The ability of an Unmanned Aerial Vehicle (UAV) to fly for an extended period of time is an important issue in today's world for various engineering applications. With a proper design process, a solar-powered aircraft could potentially fly continuously for very long periods. This research aims at designing a lightweight solar endurance glider with increased flight time by implementing vortex generators across the wingspan to improve its aerodynamic performance. The study used the ANSYS 18.1 k-omega SST turbulence model to simulate the glider at different speeds and various angles of attack for aerodynamic optimization, and the ANSYS static structural module to observe the deformations and stresses on the wingspan. The findings show that the glider should be able to maintain a flight time of at least 6 hours with triangular vortex generators, sharklet winglets, and 16 solar panels.
S2I: Decision Models in Human Resource Management 2
11:10 A Study on Machine Learning Classifier Models in Analyzing Discipline of Individuals Based on Various Reasons Absenteeism from Work
The main purpose of this paper is to analyze the disciplinary failures of employees based on various reasons for absenteeism from work. This analysis plays a vital role in the periodic performance evaluation of employees and in finding the rate of disciplinary failures in an organization; with it, the organization can develop better measures for human resource management, so that ultimately the effect of absenteeism on productivity can be reduced. The data were collected from the UCI machine learning repository and pre-processed so that the attribute "discipline failure" is the class attribute. In this paper, we used five classifier models to analyze the data and compared their performance to identify the better options for analysis. The experimental results make it quite clear that the SMO and multilayer perceptron models are the best suited for this analysis, with 100% accuracy.
11:30 Talent Optimization in Faculty Recruitment in the Post-COVID-19 Era
This paper aims to present a multiple criteria decision model that supports faculty recruitment for business schools in the post-COVID-19 era. The greatest challenge that business schools are facing is to identify relevant criteria for faculty recruitment that can help to sustain their activities in today’s highly competitive and international market. Also, these criteria should be in line with the increased focus on faculty qualification by the 2020 Association to Advance Collegiate Schools of Business (AACSB) standards.
11:50 Employees with stigmatized identity suffer more! How much psychopath and uncivil leaders contribute?
Sobia Bano (OSA, Comsats University Vehari, Pakistan)
Purpose: This research investigates our assumption that employees with a stigmatized identity are more likely to become victims of their supervisor's workplace incivility, as well as psychopathy's moderating role between stigmatized identity and workplace incivility. Design/methodology: The research used a non-probability sampling technique. Data were collected from supervisors and their subordinates serving in four tertiary healthcare institutions in Multan; 300 dyads of questionnaires were used in this study. Findings: The empirical results show that stigmatized identity is an important antecedent of workplace incivility. The findings also show that psychopathy positively predicts workplace incivility. Research limitations/recommendations: The study is not experimental in nature but follows a cross-sectional design. Future research may focus on experimental designs incorporating time elements, and since multi-level analysis is not used in this study, future designs and analyses should be nested. In addition, this study used only one form of the Dark Triad (psychopathy) to measure supervisor behavior; researchers should also consider using all three. Practical and theoretical implications: This study highlights whether individuals with a stigmatized identity are threatened in the workplace. It will help organizations act to prevent rude behavior from spreading throughout the organization. The findings have significant managerial implications and give directions for future research. Originality/value: Previous research emphasized the consequences of workplace incivility; the present study offers a unique contribution by emphasizing stigmatized identity as an antecedent of workplace incivility, together with the moderating role of psychopathy.
12:10 Understanding The Key Factors That Influence Employee Loyalty in Public Organizations
Ramy A. Rahimi (Chungnam National University, Korea (South))
This research study explores the key factors that influence employee loyalty in public organizations in South Korea. The study finds that employee empowerment, salary and rewards, training and development, and career advancement have a major and significant impact on employee satisfaction, which in turn has a strong and significant impact on employee loyalty. The research helps in understanding the evolution and current state of the relationship between employee and employer in public organizations. The study also serves as a preliminary stage for investigating, through a comparative analysis, the impact of these factors on employees of the private sector in comparison with the public sector.
S3A: Decision Aid in Logistics and Engineering 4
14:00 Simulation and analytic hierarchy process to implement outpatient appointment system at Habib Bourguiba Hospital Sfax Tunisia
The orthopedic outpatient department of the CHU Habib Bourguiba currently uses an appointment system (AS) based on the single-block rule. This system makes patients arrive early in the session and is the most primitive form of AS: patients must wait long before the consultation, and waiting rooms suffer from overcrowding. In this paper, we used simulation to evaluate twelve appointment systems resulting from the combination of four appointment rules and three scheduling rules, considering no-shows as an environmental factor. The evaluation is based on the average waiting time for the three categories of patients, the utilization of both doctors, and the average number of patients in the service during the session. Since the simulation results did not identify a dominant system, an AHP model was created to choose among the twelve alternatives. The appointment system IBFI/ALTER was chosen as the most suitable for the studied service. A sensitivity analysis was performed, and the chosen system demonstrated its robustness with respect to the evaluation criteria.
14:20 Inflation Rate and Construction Materials Prices: Relationship Investigation
Muhammad Ali Musarat (Universiti Teknologi PETRONAS, Malaysia); Wesam Salah Alaloul (Bandar Seri Iskandar & Universiti Teknologi PETRONAS, Malaysia); Abdul Hannan Qureshi (Persiaran UTP, Seri Iskandar, Perak & Universiti Teknologi PETRONAS, Malaysia); Muhammad Altaf (Persiaran UTP, Seri Iskandar, Perak & Universiti Teknology Petronas, Malaysia)
Cost is one of the essential parameters of a construction project, and any variation in it can bring the project to failure. The inflation rate is considered a primary factor causing cost overrun. This study was performed to examine this scenario in the construction industry of Pakistan. Material prices were examined statistically against the inflation rate to evaluate their correlation: first, the percentage deviation was calculated to examine the variations in prices; afterwards, the Spearman correlation coefficient was calculated to investigate their interrelation. The results revealed that the inflation rate is the foremost factor in diverging prices, as it was found to be strongly correlated with them. The strong relationship indicates that shifts in the inflation rate move material prices, resulting in project cost overrun; in this case, the relationship appeared to be inversely proportional. Therefore, the construction industry's decision-makers need to find a way to incorporate the inflation rate into cost estimation at the bidding stage.
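The Spearman rank correlation used in the study is simple to compute from rank differences; the yearly inflation and price figures below are invented, and this sketch omits tie handling.

```python
# Spearman rank correlation from the rank-difference formula; yearly
# figures are invented and ties are not handled in this sketch.
def rank(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

inflation = [3.8, 4.5, 2.9, 5.1, 6.2]     # hypothetical yearly % rates
steel_price = [410, 455, 396, 480, 525]   # hypothetical price index
print(spearman(inflation, steel_price))   # 1.0: the rankings coincide
```

Unlike Pearson's r, Spearman's coefficient measures monotonic rather than linear association, which suits price series that do not move proportionally with inflation.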
14:40 Computational Modelling and Analysis of Fullerene Based Polymer Solar Cell
Waqas Farooq (Sarhad University of Science and Information Technology, Pakistan); Muhammad Ali Musarat (Universiti Teknologi PETRONAS, Malaysia); Wesam Salah Alaloul (Bandar Seri Iskandar & Universiti Teknologi PETRONAS, Malaysia)
This paper presents computational modelling of a thin-film fullerene-based Polymer Solar Cell (PSC) based on regioregular P3HT:PCBM, also known as an organic solar cell (OSC). The structure is numerically modelled so as to boost the electrical photovoltaic parameters by modulating the thickness of the active layer, which is the combination of donor and acceptor (D/A) materials. The obtained results strongly suggest that the thickness of the active layer is one of the most important parameters for obtaining high performance from the device. The simulated structure delivered a highest energy efficiency of 4.58%, with an open-circuit voltage of 0.816 V and a fill factor of 82.71%. The paper also reports the optical absorption of photons. The proposed architecture can be helpful for potential applications in solar energy harvesting devices.
15:00 Improvement of graphical model of railway stations functioning
Dmytro Kozachenko (Dnipro National University of Railway Transport named after Academician V. Lazaryan, Ukraine); Anatoliy Verlan (LLC Transinvestservice, Ukraine); Ruslana Korobyova (Dnipro National University of Railway Transport named after Academician V. Lazaryan, Ukraine)
Improving the operation technology of railway stations and freight terminals is an urgent task for railway transport. In these conditions, increasing the efficiency of process engineers depends on introducing special software tools for simulating station technology. The aim of this study is to improve the graphical model of the functioning of railway stations and freight terminals. Graph theory and object-oriented analysis were used as research methods. The model was improved through a formal description of the connections between the objects that service operations at railway stations, as well as by adding a list of objects and technologies to the model. A graphic editor developed on the basis of the proposed model reduces technologists' workload on graphical tasks and frees up their time for solving problems of improving station technology.
S3B: Medical Decision Making 3
14:00 Prediction Model for Type 2 Diabetes using Stacked Ensemble Classifiers
Norma Latif Fitriyani, Muhammad Syafrudin and Ganjar Alfian (Dongguk University, Korea (South)); Agung Fatwanto (UIN Sunan Kalijaga, Indonesia); Syifa Latif Qolbiyani (Universitas Islam Negeri Walisongo, Indonesia); Jongtae Rhee (Dongguk University, Korea (South))
Diabetes is one of the leading causes of death globally. Undetected and untreated diabetes causes serious problems, and individuals with diabetes are at high risk of complications. Early diabetes prediction is therefore necessary to help individuals take preventive action at an early stage. This study proposes a prediction model for early prognostication of type 2 diabetes. The proposed model incorporates an isolation forest to detect and remove outlier data, and the synthetic minority oversampling-Tomek link technique to balance the data distribution. Stacked ensemble classifiers are then used to learn and predict type 2 diabetes at an early stage. We used three publicly available datasets to evaluate the performance of the proposed model against other models such as multi-layer perceptron, support vector machines, decision tree, and logistic regression. We applied 10-fold cross-validation and report four performance metrics: precision, recall, f-measure, and accuracy. The experimental results show that the proposed model outperformed the other models, achieving accuracies of up to 93.18%, 98.87%, and 96.09% on datasets I, II, and III, respectively. Early diabetes prediction is expected to help individuals take precautions once type 2 diabetes is detected.
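The preprocessing steps named above can be illustrated with a simplified, standard-library-only sketch. Two loud substitutions: a per-feature z-score filter stands in for the paper's isolation forest, and naive duplication of minority-class rows stands in for SMOTE-Tomek; the function names and the threshold `k` are illustrative, not from the paper.

```python
import random
from statistics import mean, stdev

def remove_outliers(X, y, k=3.0):
    """Drop rows where any feature lies more than k sample standard
    deviations from its column mean (a crude stand-in for the paper's
    isolation-forest outlier removal)."""
    cols = list(zip(*X))
    mu = [mean(c) for c in cols]
    sd = [stdev(c) or 1.0 for c in cols]  # guard against zero spread
    keep = [i for i, row in enumerate(X)
            if all(abs(v - mu[j]) <= k * sd[j] for j, v in enumerate(row))]
    return [X[i] for i in keep], [y[i] for i in keep]

def oversample_minority(X, y, seed=0):
    """Duplicate minority-class rows at random until all classes match the
    majority count (a crude stand-in for SMOTE-Tomek balancing)."""
    rng = random.Random(seed)
    by_cls = {}
    for row, label in zip(X, y):
        by_cls.setdefault(label, []).append(row)
    target = max(len(rows) for rows in by_cls.values())
    Xb, yb = [], []
    for label, rows in by_cls.items():
        extra = [rng.choice(rows) for _ in range(target - len(rows))]
        for row in rows + extra:
            Xb.append(row)
            yb.append(label)
    return Xb, yb
```

After these two steps, the balanced, outlier-free data would be passed to the stacked ensemble for training.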
14:20 Exploring the role of Medical Decision Making in biotechnology field through science mapping
Andrea Castorena-Robles (Monterrey Institute of Technology and Higher Education, Mexico); Nadia Karina Gamboa-Rosales (Autonomous University of Zacatecas, Mexico); Alberto Faz-Mendoza (Autonomous University of Zacatecas, Mexico); Mariano Alberto Casas-Valadez (Autonomous University of Zacatecas, Mexico); Cesar Esau Medina (Universidad Autónoma de Zacatecas, Mexico); José Ricardo López-Robles (Autonomous University of Zacatecas, Mexico)
Biotechnology is defined as the science that uses living organisms to enhance human life and the environment. However, it is not as new a science as many people think: the first registered biotechnology product was cheese, and biotechnology has been part of daily life ever since. Likewise, many cancer-related genes contain SNPs, the most common variation in the human genome. SNPs are found in genes related to important functions such as metabolism, but these variations can cause amino acid substitutions, leading to malformations in the secondary structure of the protein and affecting its functions and interactions. The biotechnology field has grown exponentially in recent years, and with it medical knowledge has expanded as well. In biotechnology, the management and organization of information is important for correct investigation; this has led to the creation of database companies that make information management easier. In this research, Medical Decision Making and biotechnology are analyzed through science mapping using bibliometric tools, in order to explore scientific productivity, compare important aspects of scientific research, and understand their importance in Medical Decision Support Systems, Genetics and Reproduction, the Drug Industry, and Clinical Practice and Research.
14:40 iMED: Ubiquitous healthcare platform for chronic patients
Mohammed Anis Oukebdane (Faculty of Sciences and Technology, Mascara, Algeria); Abd Ellah Taib (Faculty of Sciences and Technology, Mustapha Stambouli University, Algeria); Samir Ghouali (STIC Lab, Algeria); Mohammed Seghir Guellil (POLDEVA Laboratory, Algeria); Walid Cherifi (InnoDev (Dev Software), Algeria); Amina Elbatoul Dinar (LSTE Laboratory, Faculty of Sciences and Technology, Algeria)
Chronic diseases represent a major threat to individuals and communities, as they are the most common diseases and causes of death in the industrialized world as well as in developing countries. According to statistics published by the Centers for Disease Control and Prevention, the five diseases most deadly to Americans are chronic diseases, led by heart disease, which caused 23% of deaths in 2019; diabetes caused 83,564 deaths in the same year. New cases of chronic disease increase annually, and so does the fear driven by the high number of resulting deaths. The purpose of our Android application, iMED, is to improve life for people in general and chronic patients in particular by providing a medical platform that helps predict and detect diabetes and cardiovascular diseases. For the first phase (heart diseases), we used deep learning in Python to create a classification model, trained on the MIT-BIH database from PhysioNet; the model's accuracy is 97.7%. For diabetes prediction, the accuracy on the PIMA Indians Diabetes Database is 79.89%. Moreover, since a good doctor-patient relationship is the first step toward trusting the diagnosis and treatment, a smooth communication channel has been added between them through text messages, phone calls and video calls.
15:00 Efficiency Analysis to evaluate a Breast Cancer Screening Campaign: A Case Study from Tunisia
Safa Bhar Layeb (National Engineering School of Tunis, University of Tunis El Manar, Tunis & UR-OASIS : Optimization and Analysis of Service and Industrial Systems Laboratory, Tunisia); Najla Aissaoui (LR-OASIS, National Engineering School of Carthage, Tunisia); Nouha Ben Fatma and Mohamed Frikha (National Engineering School of Tunis, University Tunis El Manar, Tunisia); Zied Jemai (University of Tunis Elmanar, Tunisia); Nadra Bohli (National Institute of Applied Science and Technology, Carthage University, Tunisia); Chokri Hamouda (National Authority for Assessment and Accreditation in Healthcare, Tunisia)
Despite tremendous clinical advances, breast cancer remains one of the deadliest cancers for women around the world. This has led to a wide variety of awareness and screening programs in several countries, whose real impact is often questioned. In this perspective, this multidisciplinary study was conducted to evaluate the effectiveness of the 2019 Tunisian national breast cancer screening campaign. The Data Envelopment Analysis (DEA) approach was used to measure the relative effectiveness of the governorates with respect to this campaign. Preliminary numerical experimentation reveals promising results and highlights avenues for future studies.
S3C: Computerized Decision Aid 3
14:00 Asymmetric production metric for calculating the similarity of objects
Alexander Sorokin (Astrakhan State Technical University, Russia)
The purpose of this paper is to propose a new metric for determining the similarity of objects. Analysis showed that many existing metrics, such as the Euclidean, Minkowski, and Manhattan metrics, focus on comparing objects whose parameters are measured on the same type of scale. In addition, these metrics do not allow setting different bounds for deviations of a parameter value above or below the value of the sample object's parameter. The essence of the proposed metric is to check, for each parameter, the deviation between the value of the compared object and that of the sample object. The metric also provides operations for checking various additional conditions under which objects can be recognized as similar. As a result, the metric makes it possible to evaluate the similarity of objects whose parameters are described on ratio, ordinal, and nominal scales, together with additional similarity conditions. The experiment confirmed the workability of the proposed similarity metric. It also showed that groups of parameter values of a subset of objects formed using the proposed metric have lower variance and a smaller range of values than similar groups formed using the Euclidean and Minkowski metrics.
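The asymmetric core of the metric, deviating by different amounts above versus below the sample value, can be sketched in a few lines. This is a minimal illustration assuming per-parameter tolerance lists; the function name and signature are hypothetical, not the paper's notation.

```python
def is_similar(candidate, sample, lower_tol, upper_tol):
    """Asymmetric similarity check: parameter i of `candidate` may fall at
    most lower_tol[i] below, or upper_tol[i] above, the sample value.
    The two tolerance directions may differ, unlike Euclidean-style metrics."""
    for c, s, lo, hi in zip(candidate, sample, lower_tol, upper_tol):
        d = c - s
        if d < -lo or d > hi:
            return False
    return True
```

For example, a Ti-powder parameter might tolerate a larger overshoot than undershoot relative to the reference specification, which a symmetric distance cannot express.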
14:20 Forecast of the number of new patients and those who died from COVID-19 in Bahrain
Olena Mykolaivna Pavliuk (Lviv Polytechnic National University, Ukraine)
A review of the COVID-19 pandemic in Bahrain has been conducted. Correlations between the parameters describing the coronavirus pandemic have been established. Partially lost data were reconstructed using polynomial functions as well as linear approximation. The numbers of those who fell ill and those who died from COVID-19 were predicted using SGTM neural-like structure topologies in supervised mode.
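The gap-filling step can be illustrated with a small standard-library sketch. Linear interpolation between the nearest known neighbours is shown here as a stand-in for the paper's polynomial/linear approximation; leading or trailing gaps are left untouched.

```python
def fill_gaps_linear(series):
    """Fill None gaps in a daily case-count series by linear interpolation
    between the nearest known values on either side (illustrative stand-in
    for the paper's approximation of partially lost data)."""
    filled = list(series)
    known = [i for i, v in enumerate(filled) if v is not None]
    for a, b in zip(known, known[1:]):
        step = (filled[b] - filled[a]) / (b - a)
        for i in range(a + 1, b):
            filled[i] = filled[a] + step * (i - a)
    return filled
```

A higher-degree polynomial fit over the known points would follow the same pattern, replacing the per-gap straight line with the fitted curve.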
14:40 Application of Mobile Computer Digital Device for Current Medical and Biological Control in Futsal
In order to conduct current medical and biological control in futsal, the authors developed and introduced new methods for assessing the functional state of athletes based on a comprehensive analysis of the electrical activity of the heart, using the digital electrocardiograph Cardioplus P6 and the software package “Oracle”. Implementing the methodology reduced sports injuries in the team by 30% and improved the team’s tournament results. This research showed the necessity of using information and computer technologies in futsal as a constituent part of the training process.
15:00 Gender Prediction for Instagram User Profiling using Deep Learning
Adri Priadana (Universitas Jenderal Achmad Yani Yogyakarta, Indonesia); Muhammad Rifqi Maarif (Jenderal Achmad Yani University Yogyakarta, Indonesia); Muhammad Habibi (Universitas Jenderal Achmad Yani Yogyakarta, Indonesia)
Instagram creates new opportunities for small businesses to expand their markets. Entrepreneurs can use it to reach potential customers and tell them about their products. Hence, knowing the demographics of Instagram users is essential to converting those users into potential buyers. Demographics of social media users, such as gender, are vital for personalized advertising targeting. Most previous studies predicted gender based on text analysis. This study implements a deep learning method, the Convolutional Neural Network (CNN), to predict gender on Instagram from the user’s profile image. Deep learning is a widely known technique for extracting hidden patterns from images and can thus be useful for detecting gender from Instagram profile images. According to the performance analysis, gender prediction from profile pictures using a CNN achieved an accuracy of 70.11%. Compared with the text-based gender-prediction research in previous studies, this accuracy is lower than that of four studies in the state of the art but better than that of two others. Furthermore, this study shows that gender prediction based on image analysis using the CNN method can be performed well, particularly on the profile images of Instagram users.
S3D: Financial Decision Making 1
14:00 Determining Ownership Structure Threshold as a Basis for Financial Decision Making: A Panel Quadratic Regression Model
This research examines agency conflicts in the relation between ownership concentration and firm performance in Indonesia. A sample of 580 firms listed on the Indonesia Stock Exchange over the 2009-2018 period is investigated and analyzed using a panel quadratic regression model. The study documents that the behavior of the largest shareholders was inconsistent. A positive effect dominates when ownership concentration is below 74%, supporting the efficient-monitoring hypothesis, while an expropriation effect by the largest shareholders occurs at ownership levels above 74%. These findings suggest that regulators and policy-makers should pay attention to the ownership concentration of the largest shareholders to minimize agency problems in public firms in Indonesia.
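A quadratic model performance = a + b·ownership + c·ownership² changes sign of its marginal effect at the turning point -b/(2c), which is how a threshold like 74% is located. A standard-library sketch of the fit and the threshold, with illustrative function names (the paper's panel estimator also includes firm and time effects, omitted here):

```python
def fit_quadratic(x, y):
    """Least-squares fit of y = a + b*x + c*x**2 via the 3x3 normal
    equations, solved with Gaussian elimination (stdlib only)."""
    s = [sum(v ** k for v in x) for k in range(5)]          # moment sums
    t = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    A = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]
    for i in range(3):                                       # elimination
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [v - f * w for v, w in zip(A[r], A[i])]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                                      # back-substitution
        coef[i] = (A[i][3] - sum(A[i][j] * coef[j]
                                 for j in range(i + 1, 3))) / A[i][i]
    return coef  # (a, b, c)

def turning_point(b, c):
    """Ownership level at which the quadratic effect reverses sign."""
    return -b / (2 * c)
```

With c < 0 the parabola opens downward, so performance rises with concentration up to the turning point (monitoring) and falls beyond it (expropriation).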
14:20 Graphical Disclosure Practices in GCC Countries: A Descriptive Approach
The study describes graphical disclosure practices in 2256 annual reports of 510 GCC listed firms. To achieve this, a checklist was developed to collect information about graph characteristics in the annual reports of GCC listed firms: type of chart, chart orientation, chart dimension, period graphed, frequency of graph use, use of color, and type of data (graph objectives). In addition, information was collected on the use of graphs in the GCC, the types of graphs used in the annual reports, and how GCC countries present key performance variable (KPV) graphs.
14:40 Stock Price Prediction Using CNN and LSTM-Based Deep Learning Models
Designing robust and accurate predictive models for stock price prediction has long been an active area of research. While supporters of the efficient market hypothesis claim that it is impossible to forecast stock prices accurately, many researchers believe otherwise. Propositions in the literature have demonstrated that, if properly designed and optimized, predictive models can predict future stock prices very accurately and reliably. This paper presents a suite of deep learning-based models for stock price prediction. We use the historical records of the NIFTY 50 index listed on the National Stock Exchange (NSE) of India, from December 29, 2008 to July 31, 2020, for training and testing the models. Our proposition includes two regression models built on convolutional neural networks (CNNs) and three long short-term memory (LSTM) network-based predictive models. To forecast the open values of the NIFTY 50 index, we adopted a multi-step prediction technique with walk-forward validation: the open values are predicted over a one-week horizon, and once the week is over, the actual index values are added to the training set before the model is retrained and forecasts for the next week are made. We present detailed forecasting accuracies for all proposed models. The results show that while all the models forecast the NIFTY 50 open values very accurately, the univariate encoder-decoder convolutional LSTM using the previous two weeks’ data as input is the most accurate model, whereas a univariate CNN model using the previous one week’s data is the fastest in terms of execution speed.
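The walk-forward validation loop described above (forecast a week, then fold the actual values into the training set) can be sketched in a few lines. A naive persistence forecaster stands in for the paper's CNN/LSTM regressors; the function names are illustrative.

```python
def walk_forward_forecast(series, horizon=5, model=None):
    """Multi-step walk-forward validation: forecast one `horizon`-step block
    (a trading week), then append the actual observations to the training
    window before forecasting the next block. The default `model` is naive
    persistence (repeat the last observed value), a placeholder for a
    trained CNN/LSTM."""
    model = model or (lambda history, h: [history[-1]] * h)
    train = list(series[:horizon])          # initial training window
    forecasts = []
    for start in range(horizon, len(series), horizon):
        block = series[start:start + horizon]
        forecasts.extend(model(train, len(block)))
        train.extend(block)                 # actuals enter the training set
    return forecasts
```

Swapping in a real model only requires a callable that maps (history, steps) to a list of predictions; the retraining-every-week schedule stays the same.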
15:00 Corporate Governance and the Insolvency Risk: Evidence from Bahrain
The purpose of this study is to examine the impact of corporate governance (CG) mechanisms on the likelihood of firms having high, low, or moderate insolvency risk. Due to the trichotomous categorical nature of the dependent variable, the multinomial logistic regression analysis is used to determine the impact of CG mechanisms on the firms’ insolvency risk. Board structure and audit committee structure are used as attributes of CG, while the Altman Z-Score is used as a measure of insolvency risk. The results show that CG mechanisms, namely, the board size, audit committee size, and audit committee meetings have a significant impact on the firms’ insolvency risk. However, board meetings, board independence, and audit committee independence do not have any significant impact on insolvency risk. The findings of this study could provide awareness to managers on the relationship between the CG of the firm and the insolvency risk, with respect to Bahraini firms, which could assist them in assessing the probability of insolvency. This study could also be useful from the policy perspective for firms to take necessary measures and precautions to avoid financial distress and extend the continuity of the company by constructing long-term strategies to enhance the CG structure.
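The dependent variable above is trichotomous, built from the Altman Z-Score. A minimal sketch of the original (1968) Z-Score for public manufacturing firms and the usual zone cut-offs; the study may use a variant of the model, and the zone labels here simply mirror the high/moderate/low categories in the abstract.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Original Altman (1968) Z-Score. Inputs are the five standard ratios:
    working capital/TA, retained earnings/TA, EBIT/TA,
    market value of equity/total liabilities, sales/TA."""
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

def risk_zone(z):
    """Map a Z-Score to the trichotomous insolvency-risk category
    (distress < 1.81, grey zone 1.81-2.99, safe > 2.99)."""
    if z < 1.81:
        return "high"      # distress zone
    if z <= 2.99:
        return "moderate"  # grey zone
    return "low"           # safe zone
```

These categories would then serve as the outcome of the multinomial logistic regression, with the CG mechanisms as predictors.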
S3E: Energy Management Decisions 1
14:00 Stability Inspection of Isolated Hydro Power Plant with Cuttlefish Algorithm
The load frequency control of any system depends not merely on the control technique but also on the controller structure. This paper addresses the application of proportional-integral-derivative (PID) and PID-with-filter (PIDN) controller structures to the load frequency investigation of an isolated power generation plant. The dynamic response of this intricate control system is examined in a simpler way by considering a single power source. With the objective of obtaining stable operation of an isolated hydro power plant, a choice between the PID and PIDN controller structures tuned with the cuttlefish algorithm is presented. The cuttlefish algorithm optimizes the controller parameters to maintain system stability. The robustness assessment validates the ability of the proposed algorithm to perform remarkably well even under wide changes in system parameters and random variations in power demand.
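The controller structure being tuned can be sketched as a discrete PID, whose three gains (kp, ki, kd) are exactly the parameters an optimizer such as the cuttlefish algorithm would search over. This is a generic textbook sketch, not the paper's implementation; the PIDN variant would add a low-pass filter on the derivative term, omitted here.

```python
class PID:
    """Discrete PID controller acting on the frequency-deviation error.
    The gains kp, ki, kd are the decision variables a metaheuristic
    (e.g. the cuttlefish algorithm) would tune."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error):
        """One control update for the current error sample."""
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

A tuning loop would simulate the plant under each candidate (kp, ki, kd), score the frequency response (e.g. integral of absolute error), and let the optimizer propose the next candidates.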
14:20 Towards the assessment of the efficiency and sustainability of energy-based Investment projects
Reducing carbon emissions is one of the main strategies for developing a more sustainable production system worldwide. To align with that strategy, industries must try to reduce energy consumption or be as efficient as possible in its use. This work addresses an energy efficiency project whose aim is to improve the use of energy within the industrial plant under study. In this plant, the incorporation of an electricity generation turbine powered by the plant’s steam system is planned, so the selection of which turbine to purchase for the energy efficiency project is a complex decision problem. A financial analysis based on the Net Present Value (NPV) is proposed, using information on the standard behavior of the steam system. The results clearly identify the best option, showing the benefits of NPV for a markedly technical analysis.
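The NPV comparison at the core of the analysis can be sketched directly. The cash-flow figures and turbine names below are illustrative placeholders, not values from the study.

```python
def npv(rate, cash_flows):
    """Net Present Value at the given discount rate.
    cash_flows[0] is the initial (usually negative) investment at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def best_turbine(rate, options):
    """Pick the alternative (name -> cash-flow list) with the highest NPV."""
    return max(options, key=lambda name: npv(rate, options[name]))
```

For instance, each candidate turbine would contribute an upfront purchase cost followed by yearly energy savings estimated from the steam system's standard behavior.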
14:40 An Effective Solution to Unit Commitment Problem in Presence of Sustainable Energy Using Hybrid Harris Hawk’s Optimizer
In the present power system scenario, the complexity of the unit commitment (UC) problem increases with the large penetration of intermittent energy sources. The Harris hawks optimizer (HHO) is a recently developed optimizer that efficiently solves engineering and optimization problems, but entrapment in local minima is a common and major issue for such techniques. In this research, the search capability of HHO is enhanced by incorporating the improved grey wolf optimizer (IGWO) to solve the UC problem in the presence of sustainable energy sources. The newly developed technique is tested on a 10-unit generating system with 5% and 10% spinning reserve and is also evaluated with solar and wind power.
15:00 State of the Art: Laser Surface Texturing for Biomedical Applications
Ishwer Shivakoti (SMIT, Majitar East Sikkim, India)
The paper presents an in-depth review of laser surface texturing of engineering materials for biomedical applications. A broad literature survey has been carried out and the important aspects have been critically reviewed. The effect of laser surface texturing on various parameters relevant to biomedical applications is reviewed and presented in concise form. The review suggests that laser surface texturing has a significant influence on engineering materials in terms of biomedical application.
S3F: Management Decision 3
14:00 Intelligent processes in the context of Mining 4.0: Trends, research challenges and opportunities
Alberto Faz-Mendoza (Autonomous University of Zacatecas, Mexico); Nadia Karina Gamboa-Rosales (Universidad Autonoma de Zacatecas, Mexico); Cesar Esau Medina (Universidad Autónoma de Zacatecas, Mexico); Mariano Alberto Casas-Valadez (Autonomous University of Zacatecas, Mexico); Andrea Castorena-Robles (Monterrey Institute of Technology and Higher Education, Mexico); José Ricardo López-Robles (Autonomous University of Zacatecas, Mexico)
Industry 4.0 is a frontier topic of knowledge, inserted into the innovation ecosystem thanks to the inclusion of information and communication technologies supported by the Internet of Things and cyber-physical systems applied to industrial processes. The fourth industrial revolution describes the set of continuous transformations in the systems that surround us, and in the last five years it has become a primary issue in the industrial economy. Industry 4.0 enables high-potential scenarios in industry and the disruption of a new industrial model: the convergence of new technologies transforms sectors and markets, and the digitization of companies represents a radical change in the way of doing business. Mining is one of the sectors that has been supported by technologies associated with Industry 4.0; these technologies have created the conditions for making strategic decisions through intelligent processes whose objectives are the digitization of the mining process and the consolidation of a competitive position in the sector. This article links the concept of Industry 4.0 with that of Mining 4.0, focusing on the trends, challenges, opportunities and technologies that the mining sector requires to face them successfully.
14:20 Performance Evaluation of Pulp and Paper Mills: Bootstrap Data Envelopment Analysis Approach
The pulp and paper industry converts roundwood and recycled fibre collected from wastepaper into printing and writing papers. The pulp and paper mills in Ontario have been facing extreme competitive pressures, which have affected their performance and led to several mill closures. The purpose of this study is to evaluate and compare the relative performance of three types of Ontario pulp and paper mills (using all fibre, only roundwood fibre, and only recycled fibre). This study uses bootstrap data envelopment analysis, and the results indicate low levels of overall technical and managerial efficiency. The results provide policy makers with a detailed performance analysis so that future input resources can be reallocated to improve the performance of Ontario’s pulp and paper mills. Mills using recycled fibre, however, require large capital investments to install de-inking technology to improve performance.
14:40 A multi-criterion decision analysis based on PCA for analyzing the digital technology skills in the effectiveness of government services
Prabhat Mittal (University of Delhi & Satyawati College (E.), India)
Recent decades have witnessed how the increased use of advanced technology, new innovations, and AI startups enhances the efficiency and effectiveness of governments in public services. To remain competitive and meet the expectations of the citizens and businesses in their countries, governments continuously analyze their strategies to choose the right set of digital technologies. The objective of this research is to suggest an optimal strategy, using an MCDM-AHP (analytic hierarchy process) decision model based on principal component analysis (PCA), for governments to improve public service delivery. The study selects and analyzes the top 100 countries according to the Government Artificial Intelligence Readiness index score provided by Oxford Insights and IDRC 2019. The analysis reveals that improving data capabilities should be preferred over the other available alternatives, such as digital public services and procurement of advanced technology, and that a nation’s data and innovation capabilities can be significant for the effectiveness of government in public services.
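The AHP step can be illustrated with the standard geometric-mean approximation of the priority vector, which converts a pairwise comparison matrix of alternatives (e.g. data capabilities vs. digital public services) into normalized weights. This is a generic AHP sketch under an assumed 2x2 comparison matrix, not the paper's PCA-augmented model.

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector via the geometric-mean method:
    weight_i is proportional to the geometric mean of row i of the
    pairwise comparison matrix, then normalized to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

With a matrix saying "data capabilities is 3x as important as procurement", the weights come out 0.75 vs 0.25, which is the kind of ranking the study reports among its alternatives.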
15:00 Application of FinTech, Machine learning and Artificial Intelligence in programmed decision making and the perceived benefits
Mohammad Selim (UOB, Bahrain)
The objective of this study is to examine the perceived benefits of applying FinTech, Machine Learning and Artificial Intelligence (AI), collectively FMAI, in programmed decision making for consumers, producers and employers. The study is based on a theoretical model, developed step by step, that explains consumers’ satisfaction and employers’ benefits when a company or employer introduces FMAI for implementing policies as programmed decisions. The results show that the application of FMAI maximizes consumers’ satisfaction and employers’ benefits: it saves consumers time, minimizes the number of trips to offices, and reduces confrontations with unpleasant customer-service representatives. FMAI is user friendly and has the potential to increase consumers’ satisfaction as well as employers’ benefits by smoothly settling issues with consumers and other stakeholders, keeping consumers the focal point and ensuring that they do not desert the company. This is perhaps one of the latest studies that blends FMAI with the programmed decision-making process and shows perceived benefits for both consumers and producers, thereby maximizing society’s total wellbeing.
S3G: Decision Making Using IoT and ML 3
14:00 Value-Based Adoption Model on E-Wallet in Malaysia: A Conceptual Paper
In the current era of accelerated technological progression, e-wallet services are budding rapidly on a global scale, and Malaysia is swiftly moving towards a cashless society. This conceptual paper aims to shed light, for the body of knowledge and for market practitioners, on the determinants of perceived value and adoption intention, grounded in the Value-based Adoption Model. The findings of this research are expected to shed light on important issues related to consumers’ intention to adopt e-wallets in Malaysia that have not been the focus of previous studies. From a theoretical point of view, this research will expand understanding of the factors affecting new technology adoption from the perspective of consumers. From a managerial perspective, e-wallet service providers should create an impression of desirable benefits if a positive impact of perceived benefits is found. To be precise, higher perceived value indicates higher intention to adopt e-wallets, which suggests that adoption intention is triggered not only by extrinsic benefits but also by the intrinsic results of using an e-wallet. Hence, market practitioners should conduct market research regularly to observe consumer needs and wants and improve products and services to meet consumers’ expectations.
14:20 Indoor Air Quality Monitoring with IoT: Predicting PM10 for Enhanced Decision Support
Jagriti Saini (National Institute of Technical Teachers Training and Research, Chandigarh, India); Maitreyee Dutta (National Institute of Technical Teachers Training & Research, Chandigarh, India); Gonçalo Marques (Polytechnic of Coimbra, Portugal)
Indoor air pollution is one of the major environmental hazards. No living being can survive without air, yet breathing poor air can cause several serious health issues. Air quality is critical for health and well-being, so a reliable system to monitor indoor air quality (IAQ) is required. Advances in predicting pollutant conditions in the building environment can help occupants take preventive actions and avoid hazardous situations in time. This paper describes the functionality of an Internet of Things-based IAQ monitoring system measuring PM10, PM2.5, CO2, VOC, temperature and humidity. The main contribution of this study is an automated prediction system for PM10: the XGBoost regressor is applied to predict PM10 levels for enhanced decision support. The performance of the prediction system is measured with six relevant regression metrics; the obtained values are RMSE = 0.44, R2 score = 0.99, MSE = 0.197, MAE = 0.334, MAPE = 3.45% and prediction accuracy = 97.93%. The results show the efficiency of the proposed system in supporting building managers and occupants to prevent the critical consequences of poor air quality.
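The regression metrics reported above have standard definitions that can be computed in a few lines. A stdlib-only sketch (the regressor itself, XGBoost, is out of scope here):

```python
import math

def regression_metrics(actual, predicted):
    """Standard regression metrics of the kind reported for the PM10
    predictor: RMSE, MSE, MAE, MAPE (%), and the R2 score."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    mape = sum(abs(e / a) for a, e in zip(actual, errors)) / n * 100
    mean_a = sum(actual) / n
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    r2 = 1 - sum(e * e for e in errors) / ss_tot
    return {"RMSE": math.sqrt(mse), "MSE": mse, "MAE": mae,
            "MAPE": mape, "R2": r2}
```

Note that MAPE assumes no zero actual values, which holds for PM10 concentrations in practice.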
14:40 Cognitive Computing in Software evaluation
Due to the rapid growth of the software field, there is a tremendous increase in the building of software, websites and applications. With this increase in software products, the standards of a product also play a vital role: a product built to good standards will always be of high quality. Software evaluation is a method whose main objective is to judge a product and verify whether the software meets all the prerequisites. Cognitive computing combines several different fields and imitates human intellectual capacity in making decisions. The proposed work describes software evaluation using cognitive computing, which is very inventive, offers a wide range of benefits, and provides excellent results. It also improves a company’s efficiency and helps to build advanced software. The results are accurate and save a lot of time in performing the software processes.
15:00 Data Augmentation to Improve the Performance of a Convolutional Neural Network on Image Classification
Deep learning has become a fundamental tool to extract meaningful information from big data. However, it needs a huge amount of high-quality data to build an accurate classifier, and in many situations the training dataset is not sufficiently large to effectively train a model. This paper presents a Convolutional Neural Network trained on a very small dataset, discussing the impact of data augmentation, feature extraction and fine-tuning on the accuracy of the model. The results show that, with a small dataset, these approaches are very effective when dealing with image data.
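Data augmentation for images means generating label-preserving variants of each training example. A minimal stdlib sketch on images represented as 2D lists of pixel values; the specific transforms (flip, rotation) are common choices, though the paper does not enumerate which ones it uses.

```python
def hflip(img):
    """Horizontal flip of an image given as a 2D list of pixel values."""
    return [row[::-1] for row in img]

def rotate90(img):
    """Rotate the image 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def augment(images):
    """Expand a small training set with flipped and rotated copies,
    multiplying the number of examples by four."""
    out = []
    for img in images:
        out += [img, hflip(img), rotate90(img), rotate90(rotate90(img))]
    return out
```

In a real pipeline these transforms are applied on the fly each epoch (e.g. by a data-loader), so the network rarely sees the exact same pixels twice.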
S3H: Sustainable Decisions for a Sustainable Development 3
14:00 Housing Infrastructure Resilience Framework Development for Sustainable Future
Resilience is the enduring capacity of infrastructure systems to withstand natural disasters and to recover to stipulated performance after such disasters occur. Properly oriented studies of vulnerability, risk, and resilience assessment are currently lacking for infrastructure systems in developing countries, and for housing infrastructure in particular. The paper aims to address this gap by reviewing real field data and developing a framework for evaluating the resilience of housing infrastructure against flood hazards. The proposed framework integrates the Best Worst Method (BWM) and the Weighted Sum Method (WSM): BWM is used to find the weight of each parameter, and WSM is used to calculate the resilience of the housing infrastructure.
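The WSM aggregation step can be sketched directly; the parameter weights are assumed to have already been derived by BWM (whose min-max optimization is not shown here), and the alternative names and ratings below are illustrative.

```python
def wsm_score(weights, ratings):
    """Weighted Sum Method: the resilience index is the weighted sum of the
    normalized parameter ratings. Weights (assumed BWM-derived) sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must be normalized"
    return sum(w * r for w, r in zip(weights, ratings))

def rank_alternatives(weights, alternatives):
    """Rank housing units (name -> ratings list) by descending resilience."""
    return sorted(alternatives,
                  key=lambda name: wsm_score(weights, alternatives[name]),
                  reverse=True)
```

This separation mirrors the framework: BWM answers "how important is each parameter?", WSM answers "how resilient is each housing unit given those importances?".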
14:20 Ensemble Learning Algorithm-based Artificial Neural Network for Predicting Solar Radiation Data
Mohammed ali Jallal (Cadi Ayyad University Faculty of Sciences Semlalia, Morocco); Abdessalam EL yassini (Cadi Ayyad University Faculty of Sciences, Semlalia, Morocco); Samira Chabaa (Ibn Zohr & ENSA, Morocco); Abdelouahab Zeroual (Cadi Ayyad University, Morocco); Saida Ibnyaich (Faculty of Sciences Semlalia, Cadi Ayyad University Marrakesh, Morocco)
Reliable knowledge and the availability of accurate solar radiation measurements are a prerequisite for designing and managing solar energy systems. Frequently, there are large spatial and temporal gaps in the measurements, so predictive approaches become of interest. In the present paper, an ensemble learning approach based on the artificial neural network (ANN) technique is proposed. The proposed approach is applied to forecast the hourly time series of global solar radiation for the city of Marrakech (latitude 31°37′N, longitude 08°01′W, elevation 466 m), Morocco. The forecasting model was trained on seven years of measurements at hourly resolution using the efficient Levenberg-Marquardt optimizer. Five exogenous inputs are used for the prediction task: air temperature, relative humidity, precipitation, wind speed, and a time feature. The achieved outcomes prove the consistency and accuracy of the proposed learning approach for generating synthetic solar radiation data when needed.
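One common way to combine an ensemble's members, sketched below, is simply to average their aligned predictions; the paper's exact combiner is not specified here, and the hourly values are made up.

```python
# Toy sketch of ensemble forecasting: hourly predictions from several
# base models (the paper trains ANN members with Levenberg-Marquardt)
# are combined, here by simple averaging. Values are illustrative.

def ensemble_mean(member_forecasts):
    """Average aligned forecasts from several ensemble members."""
    n = len(member_forecasts)
    return [sum(vals) / n for vals in zip(*member_forecasts)]

m1 = [420.0, 510.0, 480.0]   # W/m^2, hourly global solar radiation
m2 = [400.0, 530.0, 470.0]
m3 = [410.0, 520.0, 490.0]
print(ensemble_mean([m1, m2, m3]))  # [410.0, 520.0, 480.0]
```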
14:40 Promoting Green Behavior in Laboratories: A Conceptual Paper
Employees’ green behavior plays a pivotal role in reducing waste in the environment and sustaining natural resources for the next generation. It is also significant in scientific labs due to the substantial energy consumption and waste generated there. Based on the existing literature, this paper recaps the definitions of green behavior in the workplace stated by various researchers. Scales developed by previous researchers to measure employees’ green behavior are also discussed.
15:00 Strategic Intelligence and Knowledge Management as drivers of Decision-Making in Mining Industry: An analysis of the literature
Alberto Faz-Mendoza (Autonomous University of Zacatecas, Mexico); Nadia Karina Gamboa-Rosales (Autonomous University of Zacatecas, Mexico); Andrea Castorena-Robles (Monterrey Institute of Technology and Higher Education, Mexico); Manuel Jesus Cobo (University of Cadiz, Spain); Rodrigo Castañeda-Miranda and José Ricardo López-Robles (Autonomous University of Zacatecas, Mexico)
One of the manifestations of the knowledge society is the increase in areas of uncertainty, transforming ignorance (the ignorance of non-knowledge) into uncertainty (the knowledge of non-knowledge). Knowledge management and strategic intelligence are transformative activities of society, implicitly connected with innovation in sustainability and with business changes that result in greater organizational potential and a knowledge economy. The objective of intelligence is to obtain value from information by analyzing large amounts of data. Mining and mineral extraction is a sector that, due to its complexity, requires decision-making tools that ensure efficiency and competitive position without forgetting the stakeholders. Therefore, this study performs a conceptual and bibliometric analysis of the output in the area of knowledge management and intelligence, quantifying performance indicators, identifying the main authors and research areas, and evaluating the development of the field using VOSviewer as the bibliometric analysis software.
S3I: Decision Models in Marketing 1
14:00 Corporate University: The Vital Human Resource and Marketing Decision-Making Model in Knowledge-Economy Today’s Competitiveness
Waleed A. Aziz (University of Bahrain, Bahrain)
A Corporate University can be defined as an educational entity that also acts as a tactical tool designed to assist its parent organization in achieving desired objectives, while conducting activities that encourage individual as well as organizational knowledge and learning (Moore 1997, p.77). Corporate Universities are also referred to as the public universities that have been developed by states to foster corporate-style behavior (Homan & Macpherson 2015, p.75). A Corporate University limits its scope to generating job-specific, indeed business-specific, training for the managerial personnel of the parent organization (Buchbinder & Newson 1990, p.355). The main goal of the corporate university is to bring a unified culture, loyalty, and a sense of belonging into the company, and to remain competitive at the same time. The study will try to explain how a CU can become a reliable decision-making model in terms of marketing management strategies. The main purpose of this study is to discuss the need for corporate universities if present-day businesses are to remain competitive in the market. Human Resource Management (HRM) has experienced several transformations over the years: the focus has shifted from administrative management tasks to being a strategic partner in the overall organizational strategy, with essential support from advances in information technologies in the knowledge area. The research aims to address how marketing decisions and human resource models can help businesses gain competitive advantage.
14:20 Intelligent Computation to build a Novel Recommender of Products through (PageRank-Clustering and DgSpan-FBR)
Samaher Al-Janabi (Babylon University, Iraq)
Data is considered one of the worthiest existing resources in the world. The fast development of technology provides us with huge/big data that needs deep analysis to understand it and to extract useful information and knowledge from it. Thus, analyzing such data represents a great challenge for researchers in the field of intelligent computation. The proposed model consists of three stages. The first stage collects and prepares the data (i.e., handling missing values and converting data into graphs). The second stage generates communities from the graphs by developing one of the graph mining algorithms, called PageRank Clustering & Verification. The third stage builds association patterns over the optimal clusters (subgraphs) and validates them by developing one of the association pattern algorithms, called Develop gSpan-Forward/Backward Rules (DgSpan-FBR). The designed model, called Novel Recommender of products based on Graph Mining (NRGM), is distinguished by reducing the time lost searching for the best products that users need, as well as by increasing the accuracy of product selection. The NRGM model is applied to a huge/big database with six parts differing in the number of samples; NRGM processes the database to generate a set of communities through PageRank-Clustering. These communities are then verified by applying two measures (Modularity and Silhouette) to find the optimal community. We tested different numbers of communities (2, 4, 6, 8, 10, and 12) and found that two communities are optimal, with a Modularity of 0.679 and a Silhouette of 0.699. Finally, a set of association patterns is found in the optimal community through both the traditional association graph mining method (gSpan) and the developed method (DgSpan-FBR): the traditional method generates 3000 patterns for each community, while DgSpan-FBR generates 262 patterns.
As a result, NRGM can be considered a pragmatic model for dealing with huge/big datasets that, at the same time, reduces the search time and increases the accuracy of the results.
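The PageRank score that the clustering stage starts from can be sketched with a minimal power iteration on a toy directed graph. The graph, damping factor, and iteration count below are illustrative, not the paper's data or its full PageRank-Clustering algorithm.

```python
# Minimal power-iteration PageRank on a toy directed graph, the kind of
# node score a PageRank-based clustering stage starts from.

def pagerank(links, d=0.85, iters=100):
    """links: {node: [outgoing neighbours]}; returns rank per node."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}
        for v, outs in links.items():
            if outs:
                share = rank[v] / len(outs)
                for u in outs:
                    new[u] += d * share
            else:  # dangling node: spread its rank uniformly
                for u in nodes:
                    new[u] += d * rank[v] / n
        rank = new
    return rank

g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
r = pagerank(g)
print(max(r, key=r.get))  # "c" accumulates the most rank
```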
14:40 The Areas and Requirements of Competitiveness Advantage in Jordanian Universities
The present study aimed to explore the areas in which Jordanian universities should invest to achieve a competitive advantage. It also aimed to identify the requirements for achieving a competitive advantage in Jordanian universities, based on the strategic plans of these universities. A qualitative approach was adopted: interviews were conducted with 10 managers at the top management level in private and public universities in Jordan. The researchers found that research, innovation, finance, academic reputation, student attraction, and technology are the most prominent areas that must be invested in to achieve a competitive advantage. The requirements for achieving a competitive advantage include offering programs in modern majors, developing e-curricula, providing e-learning services, supporting scientific research, developing the infrastructure, attracting highly qualified professionals, enjoying financial independence, achieving financial sustainability, and obtaining a high rank in global university rankings. Meeting such requirements shall enable Jordanian higher education institutions to achieve a competitive advantage. The present study is beneficial for decision makers and investors in Jordanian higher education institutions: it enables them to give more attention to the aforementioned areas and requirements in order to achieve a competitive advantage and, hence, contributes to developing the national economy.
15:00 Decision Models in Marketing: The role of Sentiment Analysis from bibliometric analysis
Mariano Alberto Casas-Valadez (Autonomous University of Zacatecas, Mexico); Alberto Faz-Mendoza (Autonomous University of Zacatecas, Mexico); Cesar Esau Medina (Universidad Autónoma de Zacatecas, Mexico); Andrea Castorena-Robles (Monterrey Institute of Technology and Higher Education, Mexico); Nadia Karina Gamboa-Rosales (Autonomous University of Zacatecas, Mexico); José Ricardo López-Robles (Autonomous University of Zacatecas, Mexico)
With the exponential growth in the use of social networks and the massive generation of information on the internet, by 2020 more than half of the world’s population is expected to have internet access and 84% to use a social network. These networks act as meeting points between people: they create links and virtual communities with common interests or activities and allow people to communicate and exchange information. Given their reach and development, these digital media have great potential as a tool for companies to get to know their customers, and with the increase in social media activity, emotions are seen as valuable products from a commercial perspective. By carefully evaluating people’s opinions and feelings, companies can reasonably find out what people think about a product or service, and consequently make the most appropriate decisions. The tool that supports extracting this information is Sentiment Analysis; to learn more about it, a review based on a bibliometric analysis was developed to trace its evolution, trends, research areas, authors, and outstanding publications.
S4A: Decision Aid in Logistics and Engineering 5
15:30 Statistical and Probabilistic-based Decision aids for Offshore Wind Turbine Reliability
The reliability of a wind turbine, or of any system, is a key parameter in decision-making to ascertain the level of assurance that the system can actively remain in operation. It also supports reasonable cost-benefit analysis, the safety legislation required of the equipment, and the choice of suitable maintenance requirements or strategies for effective performance of the system. The use of accurate statistical data and probabilistic methods is an effective aid to capture uncertainties that may lead to sudden system failure and, consequently, to provide effective mitigation plans. Various classical and structural reliability methods to aid decision making are outlined in the paper, and a step-by-step example of assessing the reliability of an offshore wind turbine is then provided.
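A classical reliability computation of the kind such methods build on can be sketched with a two-parameter Weibull survival function; the shape and scale values below are illustrative, not turbine field data.

```python
import math

# Sketch of a classical reliability computation: with a two-parameter
# Weibull failure model, the survival probability at time t is
# R(t) = exp(-(t/eta)^beta). Shape/scale values are illustrative.

def weibull_reliability(t, beta, eta):
    """Probability the component survives beyond time t (hours)."""
    return math.exp(-((t / eta) ** beta))

beta, eta = 1.2, 30000.0          # shape, characteristic life in hours
for t in (8760.0, 43800.0):       # 1 year, 5 years of operation
    print(t, round(weibull_reliability(t, beta, eta), 3))
```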
15:50 Stochastic-based Deterioration Modeling of Elevators in Healthcare Facilities
Reem Ahmed (Concordia University, Canada)
Deterioration and aging associated with building assets are becoming major concerns in most countries as their building portfolios continue to increase and expand. Healthcare facilities are a special case of building assets that inherit significant criticality and complexity within their operation and maintenance regimes, which makes monitoring asset condition and forecasting life expectancy two of the most essential functions in a healthcare environment. In this paper, a stochastic deterioration prediction approach was developed to model and estimate the degradation of elevator systems within hospital buildings, given their importance to the continuity of the hospital mission and services. Different probability distributions were fitted using historical condition data, and their performance was compared using the Anderson-Darling test. The parameters of the best distribution were then found using maximum likelihood estimation. The developed model is expected to aid decision makers in improving the planning of their maintenance and rehabilitation programs and in conducting proactive maintenance activities in a timely manner, which helps ensure the sustainability of hospital operation.
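The distribution-fitting step can be sketched for the simplest candidate: the maximum likelihood estimate for an exponential model of times between events is the reciprocal of the sample mean. The paper compares several candidate distributions with the Anderson-Darling test; the data below is made up.

```python
import math

# Illustrative sketch of the MLE fitting step for one candidate
# distribution: for an exponential model of times between condition
# events, the maximum likelihood rate is n / sum(x). Data is made up.

def exp_mle_rate(samples):
    return len(samples) / sum(samples)

def exp_log_likelihood(samples, rate):
    return sum(math.log(rate) - rate * x for x in samples)

downtimes = [12.0, 7.5, 20.0, 15.5, 9.0]   # months between condition drops
rate = exp_mle_rate(downtimes)
print(round(1.0 / rate, 1))   # MLE mean time between events: 12.8
```

By the defining property of the MLE, the log-likelihood at the fitted rate exceeds that at any other rate, which is what distribution comparison then exploits across candidate families.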
16:10 Neutrosophic-AHP-based GA Model for Renewals Planning of Hospital Building Assets
Reem Ahmed (Concordia University, Canada)
Healthcare building infrastructure in Canada currently faces two problems, aging and deferred maintenance, leading to an increase in unexpected failures that interrupt hospital operation, which in turn affects the health and safety of its occupants. Despite the efforts exerted to overcome this, solutions cannot be easily implemented, as they are often faced with limited and insufficient funds. Therefore, previous research has experimented with ways of reducing rehabilitation costs while sustaining an acceptable physical condition of hospital assets. However, there is more to a building’s performance than its physical condition. Hence, this study assesses hospital performance by including functional parameters of components rather than solely evaluating their physical condition, on the basis of an integration of Neutrosophic Logic and the Analytic Hierarchy Process. It then improves rehabilitation decisions by using the output of that model as the objective of a genetic algorithm optimization model that prioritizes rehabilitation activities within a limited funding allowance. The developed model was validated by applying it to a real hospital, where the results obtained from the model were compared to the actual output of rehabilitation works inside the facility; the model developed in this study outperformed current practice by an improvement of 34%. This framework is expected to aid decision-makers in efficiently allocating rehabilitation funds to the most critical hospital building systems, which in turn improves the performance and availability of hospital assets.
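The AHP weighting step that the paper extends with neutrosophic logic can be sketched with the standard column-normalization approximation of the priority vector; the 3x3 pairwise matrix below is an illustrative, consistent example, not the paper's criteria.

```python
# Sketch of the classical AHP weighting step: priorities are
# approximated by normalizing each column of the pairwise-comparison
# matrix and averaging across columns. The matrix is illustrative.

def ahp_weights(M):
    n = len(M)
    col_sums = [sum(M[i][j] for i in range(n)) for j in range(n)]
    return [sum(M[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# hypothetical criteria: condition vs. criticality vs. cost
M = [[1.0,  2.0, 4.0],
     [0.5,  1.0, 2.0],
     [0.25, 0.5, 1.0]]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # [0.571, 0.286, 0.143]
```

Because this example matrix is perfectly consistent, the weights come out exactly in the 4:2:1 ratio of the comparisons.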
16:30 Realization of Low Scattering for a High-Gain Planar Antenna Using an Artificial Dual-Layer Metasurface
Abdessalam EL yassini (Cadi Ayyad University Faculty of Sciences, Semlalia, Morocco); Mohammed ali Jallal (Cadi Ayyad University Faculty of Sciences Semlalia, Morocco); Saida Ibnyaich (Faculty of Sciences Semlalia, Cadi Ayyad University Marrakesh, Morocco); Abdelouaheb Zeroual (Cadi Ayyad University Faculty of Sciences, Semlalia, Morocco); Samira Chabaa (Ibn Zohr & ENSA, Morocco)
A novel planar antenna loaded with a dual-layer metasurface (MS) is presented in this study. The suggested antenna structure includes a circular planar radiating element loaded with the MS. The main motivation of this study is to improve the radiation pattern of the conventional antenna. Experimental results were obtained to validate the simulation of the suggested antenna. They confirm that its operating frequency and band are 2.44 GHz and 2.37-2.49 GHz, while the gain is up to 7 dBi. In addition, the simulation and measurement results show good coherence.
S4B: Medical Decision Making 4
15:30 Optimal Selection of Pyramid Pooling Components for Convolutional Neural Network Classifier
There has been an increasing amount of research on deep learning applications in screening for eye diseases. Since most screening tools will be deployed in rural areas that lack modern medical equipment, it is best if the screening algorithm can run on a mobile platform. A lightweight deep learning model fits the mobile platform well, as it requires little memory and imposes a low computational burden on the hardware. Therefore, a modified ShuffleNet V1 network is proposed to screen for eye diseases using fundus images. A spatial pyramid pooling module is integrated at the exit flow of the network, such that memory usage remains relatively the same but classification accuracy improves. The better performance can be attributed to the exit module, which extracts features at various scales better than a simple global average pooling operator. The best mean accuracy of 0.7564 is obtained when maximum down-pooling operators are used with a kernel set of 2, 4, and 7. Furthermore, a wrong selection of hyper-parameters, as in the case of kernel set 5, 6, and 7, also with maximum pooling operators, returns the lowest accuracy of just 0.7011. Therefore, an optimal selection of the kernel set and down-pooling operators is vital to improving classification performance. The proposed lightweight model can be further improved by using a separable convolution scheme, which is a factorized version of regular convolution.
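The spatial pyramid max-pooling idea can be sketched as follows: the feature map is divided into a k x k grid for each kernel in the set, and the maximum of each cell is kept, so the output length depends only on the kernel set, not on the input size. The toy kernel set and feature map below are illustrative, not the paper's configuration.

```python
# Sketch of spatial pyramid max-pooling: for each kernel k, split the
# feature map into a k x k grid and keep the max of each cell, yielding
# a fixed-length vector for any input size. Values are illustrative.

def spp_max(feature_map, kernels=(1, 2)):
    h, w = len(feature_map), len(feature_map[0])
    out = []
    for k in kernels:
        for gi in range(k):
            for gj in range(k):
                r0, r1 = gi * h // k, (gi + 1) * h // k
                c0, c1 = gj * w // k, (gj + 1) * w // k
                out.append(max(feature_map[r][c]
                               for r in range(r0, r1)
                               for c in range(c0, c1)))
    return out

fmap = [[1, 5, 2, 0],
        [3, 4, 1, 7],
        [0, 2, 6, 1]]
vec = spp_max(fmap)   # 1*1 + 2*2 = 5 values regardless of map size
print(len(vec), vec[0])
```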
15:50 An Integrated Weighting-based Modified WASPAS Methodology for Assessing Patient Satisfaction
Satender Pal Singh (Indian Institute of Management Ranchi, India); Tithishri Kundu (Bankura Sammilani Medical College, India); Arnab Adhikari (Indian Institute of Management Ranchi, India); Sumanta Basu (Indian Institute of Management Calcutta, India)
Due to the increased growth and high competition in the healthcare sector, it is important for healthcare service providers to deliver a superior service experience to patients. Improvements in the patient experience enhance their satisfaction level. In the context of patient satisfaction, it is crucial to identify the essential factors and assign weights to them. We propose an integrated weighting approach that allocates weights to the factors by combining the weights obtained from different objective weighting methods, overcoming the issue of relying on a single weighting method. A modified weighted aggregated sum product assessment method (MWASPAS) is used to determine the patient satisfaction score, using the weights obtained from the integrated weighting method under the weighted product model (WPM) and the weighted sum model (WSM). The WASPAS method is used over other MCDM methods in our study due to the absence of any conflicting variables and complex calculations. We then determine a single patient satisfaction score from the WSM and WPM scores along with their respective weights. We apply the proposed methodology to real-life data collected from a healthcare provider in Kolkata, India. The results indicate variation in the weights assigned to the factors and in the final scores across the weighting methods. Also, the score obtained from the WASPAS method balances the scores determined by the WSM and WPM methods by taking the effects of both into account.
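The core WASPAS aggregation can be sketched as a blend of the weighted-sum and weighted-product scores; the factor weights, ratings, and blending coefficient below are illustrative, not the study's data.

```python
# Sketch of the WASPAS aggregation: the WSM score (weighted sum) and
# the WPM score (weighted product) of normalized ratings are blended
# with a coefficient lam. All numbers are illustrative.

def waspas(weights, ratings, lam=0.5):
    wsm = sum(w * r for w, r in zip(weights, ratings))
    wpm = 1.0
    for w, r in zip(weights, ratings):
        wpm *= r ** w
    return lam * wsm + (1 - lam) * wpm

weights = [0.3, 0.4, 0.3]   # hypothetical: staff care, waiting time, facilities
ratings = [0.9, 0.7, 0.8]   # normalized patient ratings
score = waspas(weights, ratings)
print(round(score, 4))
```

With lam = 1 the score reduces to pure WSM and with lam = 0 to pure WPM; intermediate values balance the two, as the abstract describes.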
16:10 Development of a collaborative decision-making framework to improve the Patients’ Service Quality in the Intensive Care Unit
Healthcare is one of the biggest and most complex service sectors, where decisions need to be taken quickly, accurately, and effectively; service improvement decisions in Intensive Care Units (ICUs) in particular are considered a predominant factor. In this study, a collaborative decision-making framework involving multiple stakeholders is developed to improve patients’ service quality in the ICU. The key criteria and alternatives that can advance service quality in the ICU are identified through an in-depth literature analysis. The best-worst method (BWM) is integrated with the Multi-Actor Multi-Criteria Analysis (MAMCA) method to capture stakeholders’ opinions. This integrated framework allows stakeholder groups to participate in the decision-making process and to select an effective strategy for improving patients’ service quality. The results show that hiring part-time physicians and medical staff, and hiring full-time physicians and medical staff, are the best solutions to improve service quality in the ICU according to the physician and patient stakeholders, respectively.
16:30 WEB Predictor COVIDz: Deep Learning for COVID-19 Disease Detection from chest X-rays
Mohammed Seghir Guellil (POLDEVA Laboratory, Algeria); Samir Ghouali (STIC Lab, Algeria); Emad Kamil Hussein (Al-Furat Al-Awsat Technical University & Al-Mussaib Technical College, Iraq); Mohammed Anis Oukebdane (Faculty of Sciences and Technology, Mascara, Algeria); Amina Elbatoul Dinar (LSTE Laboratory, Faculty of Sciences and Technology, Algeria); Walid Cherifi (InnoDev (Dev Software), Algeria); Abd Ellah Taib and Boualem Merabet (Faculty of Sciences and Technology, Mustapha Stambouli University, Algeria)
While writing these words, the number of COVID-19 infected persons exceeded 20,730,456 and the disease had caused 751,154 deaths across the world, as reported by WHO (World Health Organization) statistics. The matter has become a reality and the damage is very severe; there is no longer any way to protect humanity from this epidemic except diagnosis and prevention, especially with the delay in the emergence of any vaccine recognized by the World Health Organization so far. Without therapeutic treatment or specific vaccines for COVID-19, it is fundamental to detect the disease at an early stage and to be able to quickly isolate an infected patient. This study therefore looked at the diagnostic value and consistency of chest imaging. Access to imaging is not always possible, accessible, or feasible. Our application solves this problem: with the WEB Predictor COVIDz and a deep learning program, we can systematically take a chest X-ray image and predict the probability of the absence or presence of COVID-19. The proposed approach (a custom VGG model) and our COVIDz web site, in objective validation of the suggested solution, obtained the best classification efficiency of 99.64%, an F-score of 99.2%, a precision of 99.28%, an MCC of 99.28%, a recall of 99.28%, and a specificity of 100%.
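The metrics the abstract reports are all derived from a binary confusion matrix, as sketched below; the counts are illustrative, not the paper's actual test split.

```python
# Sketch of the evaluation metrics reported for a binary classifier,
# computed from confusion-matrix counts. Counts are illustrative.

def metrics(tp, fp, fn, tn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)            # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, specificity, f1, accuracy

p, r, s, f1, acc = metrics(tp=138, fp=0, fn=1, tn=140)
print(round(acc, 4), round(s, 4))  # 0.9964 1.0
```

With zero false positives, precision and specificity are perfect while the single false negative slightly lowers recall, F1, and accuracy.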
S4C: Computerized Decision Aid 4
15:30 A Benders Decomposition Approach for Low Rank Concave Minimization Problems
We propose a Benders decomposition approach for an important class of mixed-integer concave minimization problems that is particularly suitable for problems with few concave terms, i.e., low-rank problems. Unlike Generalized Benders decomposition, where the nonlinearity is handled in the subproblems, we handle concavity at the master problem and use a unique property of concave minimization to carry out an implicit enumeration. To our knowledge, this approach is the first to tackle concave minimization problems via Benders decomposition. We tested and benchmarked the proposed approach against state-of-the-art commercial solvers and found it to outperform them in many cases in terms of computational time and/or solution quality.
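For reference, the classical Benders split that such approaches build on can be sketched as follows; this is the textbook linear form, not the paper's formulation, which instead moves the concave terms into the master problem.

```latex
% Textbook Benders decomposition of
%   \min_{x \in X,\, y \ge 0} \; c^\top x + f^\top y
%   \text{ s.t. } Ax + By \ge b.
% For a fixed master solution $\bar{x}$, the subproblem and its dual are
\min_{y \ge 0} \{\, f^\top y : By \ge b - A\bar{x} \,\}
\quad\Longleftrightarrow\quad
\max_{u \ge 0} \{\, u^\top (b - A\bar{x}) : B^\top u \le f \,\},
% and each dual extreme point $u^k$ yields the optimality cut
\eta \;\ge\; (u^k)^\top (b - Ax),
% accumulated in the master problem
\min_{x \in X,\;\eta}\; c^\top x + \eta \quad \text{s.t. the cuts above.}
```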
15:50 Ontology-Based Approach to Determine the Coverage of Examination Papers
Examinations play an important role in the learning process and the whole education system. They are a way of assessing what students have learned in particular subjects, and students’ strengths and weaknesses can be assessed through them. Therefore, examination papers should meet standard quality attributes. Question preparation is a very challenging step for academics: sometimes they fail to prepare question papers that address all the points in the syllabus, and it is a critical problem in education when papers fail to cover all the learning outcomes. Currently, there is no automated system to check the coverage of examination papers, and checking it manually is a difficult and time-consuming task. In this research, an ontology-based approach is proposed to determine the coverage of examination papers. First, the ontology for a particular subject was created using the Protégé software. Then, soft copies of examination papers relevant to the subject were collected and their stop words removed. The created ontology was read with Java and the Jena library, a list of the subclasses of the ontology was retrieved, and the path of each word included in the ontology was discovered; in parallel, the number of all concepts in the ontology was counted. Using the paths of the words, the coverage of the examination papers was evaluated. For the 5 subjects we analyzed, the ontology-based approach reported an average coverage of 84.81%, while the experts assessed it at 79.99%. The empirical study of our prototype system has proved the effectiveness of our proposed ontology-based method.
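The final coverage computation reduces to a ratio, sketched below: once exam-paper terms have been mapped to ontology concepts, coverage is the fraction of ontology concepts the paper touches. The concept names are hypothetical, and the real system does the matching via Jena over ontology paths rather than plain set membership.

```python
# Sketch of the coverage computation: the fraction of ontology concepts
# matched by terms found in the examination paper. Names are made up.

def coverage(ontology_concepts, matched_terms):
    hit = {c for c in ontology_concepts if c in matched_terms}
    return len(hit) / len(ontology_concepts)

concepts = {"normalization", "sql-join", "transaction", "indexing"}
found = {"sql-join", "transaction", "indexing"}
print(coverage(concepts, found))  # 0.75
```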
16:10 Comparison between Support Vector Machine and Random Forest for Hepatocellular Carcinoma (HCC) Classification
Velery Virgina Putri Wibowo and Zuherman Rustam (University of Indonesia, Indonesia); Sri Hartini (Universitas Indonesia, Indonesia); Qisthina Syifa Setiawan and Jane Eva Aurelia (University of Indonesia, Indonesia)
Hepatocellular Carcinoma (HCC) is a type of liver cancer that occurs when a tumor grows malignantly in the liver. This cancer starts in the liver and is not caused by the spread of cancer from other organs; it commonly occurs as a complication of liver disease. However, most patients do not show signs and symptoms in the early stage of liver cancer. Therefore, classification with high accuracy is needed to predict individuals with HCC early, based on their data, and to provide them with the best treatment. In this study, the data used consisted of 192 samples, 66 HCC and 126 non-HCC, obtained from Al Islam Bandung Hospital. Many machine learning methods have been used for classification; among them, the Support Vector Machine (SVM) and Random Forest (RF) have been frequently used due to their high level of performance. Therefore, in this study, SVM and RF were compared and analyzed for the classification of HCC, with the aim of discovering which method has the best accuracy. The results showed that SVM and RF reached highest accuracy values of 90% and 100%, respectively. Therefore, the RF method is a better model than SVM and is suggested for use in the classification of HCC.
16:30 Comparing Decision Tree and Logistic Regression for Pancreatic Cancer Classification
Qisthina Syifa Setiawan and Zuherman Rustam (University of Indonesia, Indonesia); Sri Hartini (Universitas Indonesia, Indonesia); Velery Virgina Putri Wibowo and Jane Eva Aurelia (University of Indonesia, Indonesia)
Cancer is a disease that causes the development of abnormal cells in any part of the body and can lead to death. Pancreatic cancer is the type of cancer marked by abnormal cells starting to develop in the pancreas. Sometimes, affected individuals do not show any signs or symptoms at an early stage. Treatments are chosen based on how widely the cancer has spread, with the aim of extending the lives of those affected. Therefore, machine learning classification algorithms with the best accuracy are needed to assist the medical field in classifying individuals with pancreatic cancer. In this research, the Decision Tree and Logistic Regression classification algorithms were used and compared to discover which has the best performance based on accuracy. The results showed that the Decision Tree and Logistic Regression yielded 100% and 92.68%, respectively, as their highest accuracy. Therefore, the Decision Tree is the better method, based on accuracy, for classifying pancreatic cancer.
S4D: Energy Management Decisions 2
15:30 Multi-Area Dynamic Dispatch Mathematical Formulation Incorporating PEVs/BEVs and Renewable Energy Sources
Challa Leela Kumari (Lovely Professional University, India); Vikram Kamboj (Lovely Professional University, Jalandhar, Punjab, India); S. K. Bath (GZSCCET Maharaja Ranjit Singh Technical University, India)
The multi-area dynamic load dispatch problem is a vital issue in power system scheduling, processing, organizing, and managing. The issue is explored here for a combination of electric utilities from various regions. The mathematical formulation of the multi-area dynamic dispatch problem incorporating plug-in electric vehicles (PEVs), battery electric vehicles (BEVs), and renewable energy sources is explained in this paper. This formulation will be useful for research on multi-region economic load dispatch problems with electric vehicles (EVs) and Renewable Energy Sources (RES).
15:50 Cuckoo Search Optimization based MPPT for Integrated DFIG-Wind Energy System
Srikanth Goud B (Koneru Lakshmaiah Education Foundation & Anurag College of Engineering, India)
Wind energy is a novel source introduced to help satisfy the global demand for energy required by several utilities. The performance characteristics of an integrated grid depend mainly on the control techniques employed. In the proposed work, the grid is supplied by wind power, which is first converted to DC using a DC-DC converter, fed to the DC link bus, and then converted to AC using an inverter. The DC-DC converter employs Cuckoo Search Optimization (CSO) MPPT to generate the duty pulses required to boost the voltage at the converter, and a comparative analysis is carried out against existing methods such as Perturb and Observe (P&O) and Particle Swarm Optimization (PSO). The work is implemented in the MATLAB/Simulink platform.
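The P&O baseline that CSO is compared against can be sketched as a simple hill-climbing loop: the duty cycle is nudged, and the perturbation direction is kept if output power rose and reversed if it fell. The PV power curve below is a made-up concave function of duty cycle, not a converter model.

```python
# Toy Perturb-and-Observe MPPT loop: nudge the duty cycle, keep the
# direction while power rises, reverse when it falls. The power curve
# is an illustrative concave function, not a real converter model.

def pv_power(duty):
    """Illustrative concave power curve peaking at duty = 0.6."""
    return 100.0 - 400.0 * (duty - 0.6) ** 2

def perturb_observe(duty=0.3, step=0.02, iters=60):
    power = pv_power(duty)
    direction = 1
    for _ in range(iters):
        duty += direction * step
        new_power = pv_power(duty)
        if new_power < power:       # power dropped: reverse perturbation
            direction = -direction
        power = new_power
    return duty

d = perturb_observe()
print(round(d, 2))  # oscillates near the 0.6 maximum
```

The steady-state oscillation around the peak (of roughly one step size) is the classic P&O drawback that metaheuristic MPPT methods such as CSO aim to reduce.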
16:10 Modelling for Optimal Load Dispatch of Integrated Renewable Energy Source/BESS/Electric Vehicle Charging Station
To enhance the utilization of renewable energy resources, to fulfill the increased demand for energy, and to reduce global warming and the degradation of ecosystems, the economic load dispatch of an integrated system considering renewable energy sources (RES), electric vehicle loads/charging stations (EVL), and battery energy storage systems (BESS) has been planned across the world. The main aim of economic load dispatch in power system operation is to meet the energy demand at the lowest cost while satisfying all equality and inequality constraints. This paper presents the mathematical formulation of the optimal load dispatch problem, considering energy generation from conventional power plants and renewable energy sources together with electric vehicles (plug-in, PEVs, and battery, BEVs). The power from standby electric vehicles can be used as a reserve for ancillary services and utilized as spinning reserve.
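The classical core of such a formulation can be sketched with quadratic generation costs: at the optimum every dispatched unit runs at the same incremental cost lambda, which can be found by bisection until generation meets demand. The unit data is illustrative, and the EV/RES constraints the paper adds are omitted.

```python
# Sketch of classical economic dispatch with quadratic costs
# C_i(P) = a_i * P^2 + b_i * P: at the optimum each unit satisfies
# P_i = (lambda - b_i) / (2 * a_i), and lambda is bisected until total
# generation meets demand. Unit data is illustrative.

def dispatch(units, demand, lo=0.0, hi=100.0, tol=1e-9):
    def total(lam):
        return sum(max(0.0, (lam - b) / (2 * a)) for a, b in units)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total(mid) < demand:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    return lam, [max(0.0, (lam - b) / (2 * a)) for a, b in units]

units = [(0.008, 7.0), (0.010, 8.0)]   # (a_i, b_i) for two thermal units
lam, P = dispatch(units, demand=500.0)
print(round(sum(P), 3))  # 500.0
```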
16:30 Enhancing Operational Decisions in Electric Power Systems under Blackouts
The operation of electric power systems is a complex task that includes the control and supervision of a great number of elements. A failure in any of them could generate system disturbances and, if these worsen, blackouts can occur. In this regard, this work presents a model that supports decisions regarding the efficiency of these systems. The model analyzes line status considering a combination of factors that are not frequently addressed in the literature. It enhances operational decisions by proposing alternative schedules that reduce the possibility of system failures. To prove the effectiveness of the model, a real case is studied: the great blackout of the Argentine system in 2019. The results show alternative solutions that would have been useful for operating the system during this event, reducing the power flows in at-risk lines with a cost increase of only 0.05%.
S4E: Decision Making Using IoT and ML 4
15:30 Decision-making support system for fruit disease classification using Deep Learning
Eduardo Assunção and Catarina Diniz (University of Beira Interior, Portugal); Pedro D Gaspar (University of Beira Interior & C-MAST – Center Mechanical and Aerospace Science and Technologies, Portugal); Hugo Proença (University of Beira Interior & IT-Instituto de Telecomunicações, Portugal)
Fruit diseases are a continuous hazard to farmers. By applying computer vision-based techniques, precision agriculture can support farmers in decision making for fruit disease control. Feature extraction is an essential task in the computer vision pipeline. Nowadays, features for fruit disease detection are generally handcrafted. However, empirical results across domains confirm that features learned by convolutional neural networks (CNNs) provide significant accuracy improvements over handcrafted features, and CNNs have been applied in many computer vision tasks, replacing hand-engineered models. In general, a large-scale image dataset is necessary to train a CNN; however, few fruit disease images are available to compose such a dataset. We propose to train a tiny, efficient deep convolutional network designed to run on mobile devices to classify healthy peach fruits and three peach diseases. Based on transfer learning techniques and data augmentation strategies, the proposed model achieves a macro-average F1-score of 0.96 and does not misclassify any disease class. This achievement shows the potential of small CNN models for fruit disease classification when only a small quantity of training data is available.
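The macro-average F1-score reported above is the unweighted mean of per-class F1 scores, which is why no single disease class can be quietly misclassified without dragging the score down. A minimal sketch of the metric, with a hypothetical two-class confusion matrix rather than the paper's data:

```python
def macro_f1(conf: list) -> float:
    """Macro-averaged F1: the unweighted mean of per-class F1 scores.

    conf[i][j] = number of samples of true class i predicted as class j.
    """
    n = len(conf)
    f1s = []
    for c in range(n):
        tp = conf[c][c]
        fp = sum(conf[r][c] for r in range(n)) - tp   # predicted c, wrong
        fn = sum(conf[c][r] for r in range(n)) - tp   # true c, missed
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / n

# Hypothetical 2-class confusion matrix (e.g. healthy vs. one disease)
score = macro_f1([[9, 1], [2, 8]])
```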
15:50 Promoting Patient Safety through Machine Learning
Ragheb H Al Nammari (Khalifa University, United Arab Emirates)
Healthcare facilities collect massive amounts of medical data, which offer an opportunity to improve patient safety. Applying machine learning tools to such data can lead to a greater understanding of patterns as well as increased accuracy in diagnosing patients. Moreover, wearable devices collect patients' data such as heart rate, skin temperature, and step count to provide insight into a patient's health. However, the use of machine learning in areas such as healthcare faces several challenges. To shed light on such applications, this paper reviews some machine learning applications in a patient safety context. Additionally, it discusses the adoption of machine learning predictive algorithms and wearable devices as well as the social and legal challenges associated with this adoption.
16:10 Machine Learning for Strategic Decision Making during COVID-19 at Higher Education Institutes
Machine learning is becoming a driving force for strategic decision making in higher education institutions, and it calls for cooperation between stakeholders and the use of efficient computational methods. Conversely, decision making can consume much time when data and computational methods are not used in the process. The utilization of machine learning is essential for thorough data analysis and decision making; this branch of artificial intelligence can deliver remarkable output for educational institutes when it comes to decision making. This paper analyses the output generated by machine learning algorithms that predict the applicability rate of a no-detriment policy for e-learning during COVID-19. The study investigates the performance of machine learning algorithms for strategic decision making in higher education institutes, the Global College of Engineering and Technology in particular, in determining whether a no-detriment policy will be applicable for a particular student based on the student's performance before COVID-19. The study shows that the Random Forest algorithm performs better than Support Vector Machine, Decision Tree and Naïve Bayes.
16:30 Combined multi-layered big data and responsible AI techniques for enhanced decision support in Shipping
The shipping industry has recently begun to experience a rapid change in the way data from ships is collected and processed. Satellite communications, telemetry, data collection and data analytics are some of the contemporary technologies that enable rapid and efficient data acquisition and processing, making fleet-wide remote monitoring possible. The key technological challenges on the way to an Internet of Ships framework are the demands on infrastructure and humans as well as on advanced analytics such as deep learning techniques. The main goal of our work is to introduce an innovative platform that harmonizes, through big data technologies, data collected from various onboard sensors and implements extreme-scale processing techniques for operational efficiency and performance optimization. The platform is further benchmarked on a series of pilot demonstrations regarding fuel oil consumption prediction.
S4F: Management Decision 4
15:30 Transforming Business Decision Making with Internet of Things (IoT) and Machine Learning (ML)
Disruptive technologies such as the Internet of Things, machine learning, and artificial intelligence are gaining acceptance among businesses and individuals as they realize their potential for driving business opportunities and revenue growth. Business owners want to exploit the advantages of these technologies and draw essential inferences from their data, but they are not equipped with the right tools and people to do so. This paper enumerates and presents the recent challenges in adopting these disruptive technologies in business processes. We present an architectural model of an IoT- and ML-based application that can be integrated with various other enterprise applications to enable real-time data analytics, visibility and decision making. A few specific business applications in industries such as manufacturing, pharmaceuticals, and hospitality are discussed briefly. Finally, we list the major challenges and probable solutions.
15:50 Adoption of E-business: A Systematic Literature Review
Technology influences us in many ways, including in business. Traditional businesses must adapt to digitalization, which can alter several elements of a business, such as its processes, model, organizational structure, and relationships with customers and business partners. Therefore, research on the effect of e-business adoption is important, particularly for business actors. Aside from its benefits, e-business also entails obstacles. This research aims to analyse studies about the effect of e-business and to provide suggestions for future research. It employs the systematic literature review method, covering 31 relevant studies that are analysed comprehensively and systematically according to a review protocol. This research suggests options for future work to provide comprehensive and accurate analyses of the effect of e-business adoption.
16:10 Escalation commitment in decision making and its possible effects in the long run
Mohammad Selim (UOB, Bahrain)
The objective of this study is to investigate the causes of the escalation-of-commitment type of faulty decision making and to evaluate its destructive effects on firms, businesses, states, governments and all other decision makers. The model is developed step by step and analyzed purely on theoretical foundations. It identifies certain factors that may lead to escalation of commitment in decision making. Even though escalation of commitment is faulty decision making, many decision makers, including business leaders and heads of state, have in one way or another adopted it and created disasters for their organizations as well as for the world. The two World Wars, the Vietnam War and the Iraq War are examples in which the actors and decision makers escalated their commitment and deployed ever more resources and soldiers, creating deaths and destruction so massive that minor gains, if any, cannot justify the escalation. The results and findings of this study show that sunk cost, absolute power of leaders, non-accountability, blind support, hiding one's own faults and personal involvement are some of the major factors that may lead to such escalation of commitment, while rational thinking in the decision-making process can halt such faulty decision making, saving enormous resources and the lives of millions if decision makers apply rational thinking before any decision. This study will serve as an eye-opener for world leaders, major decision makers in business and key policy makers, helping to prevent escalation of commitment by steering decision makers toward rational thinking and accountability, and by curbing their absolute powers and placing other restrictive measures before any further resources or manpower are committed or any unnecessary war is waged.
16:30 Consumer Vehicle Purchase Decision-making during COVID-19
This paper examines the impact of COVID-19 on consumers' decision-making when purchasing a vehicle. Consumers face several sets of criteria to select the right vehicle during the global pandemic. Health and safety have become paramount concerns while, at the same time, consumer purchasing power has changed given economic conditions. We focus on consumer attitudes and preferences when purchasing a vehicle. Vehicle selection is treated as a multiple criteria decision-making problem. The Analytical Hierarchy Process (AHP) approach is used to determine consumer preferences towards vehicle selection using the following criteria: financial aspects, maintenance aspects, vehicle features, and promotions offered. The objective of this research is to help understand which alternative is best for the consumer during COVID-19 given the preselected criteria. The paper also sheds light, for dealerships in the Kingdom of Bahrain, on the offers, vehicle features, and after-sales service that could be offered to potential buyers during COVID-19.
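For readers unfamiliar with AHP, priority weights are derived from a pairwise comparison matrix. The sketch below uses the common geometric-mean (row) approximation over the four criteria named in the abstract; the comparison judgments are hypothetical, not the paper's data:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric-mean method:
    normalize each row's geometric mean so the weights sum to one."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise judgments on Saaty's 1-9 scale over the criteria:
# financial, maintenance, vehicle features, promotions.
matrix = [
    [1,     3,   2,     5],
    [1 / 3, 1,   1 / 2, 2],
    [1 / 2, 2,   1,     3],
    [1 / 5, 1 / 2, 1 / 3, 1],
]
weights = ahp_weights(matrix)
```

A consistency check (Saaty's consistency ratio) would normally follow before the weights are used to rank alternatives.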
S4G: Decision Making Using IoT and ML 5
15:30 Smart Agent Edge Microservices Deployment Approach
Badr El Khalyly (Hassan II University, Morocco); Mouad Banane (University Hassan II, Morocco); Allae Erraissi (Hassan II University & Faculty of Sciences, Morocco); Abdessamad Belangour (Hassan II University, Morocco)
Smart agents are essential elements in an Internet of Things ecosystem. They are made up of sensors and actuators linked to a microcontroller and are programmed through applications deployed at the microcontroller level. These programs can be modified, replaced or deleted, and the components of a smart agent may change, i.e., a sensor or actuator may be added or removed. The smart agents must continue to operate without being stopped to deploy new programs. This requires tools that ensure continuous deployment and integration, which DevOps provides through a continuous deployment chain. We offer an automation solution for the deployment and continuous integration of Edge microservices on microcontrollers. These systems are based on microservices deployed as Docker containers. The solution allows the Internet of Things developer to install, modify, and delete containers at the microcontroller level and to control microcontrollers remotely while maintaining their continuity of operation.
15:50 An overview of Intrusion Detection Based on Deep Learning Techniques
One of the main challenges in computer networks is keeping up with patterns of threats that evolve and increase on a daily basis. Traditional mechanisms such as firewalls exist, but they do not ensure the detection of new types of attacks. Intrusion detection systems are tools to detect attacks, but they suffer from an inability to detect unknown attacks. Therefore, research has turned to machine learning and data mining methods to increase the ability to predict new types of attacks. This study reviews and analyses the research background of Intrusion Detection Systems (IDSs) based on Deep Learning (DL) or Machine Learning (ML) methods, organizes it into a logical taxonomy, and pinpoints the challenges and future opportunities in this vital area. Several techniques help an IDS identify the changing behaviour of a system, and some papers have lately proposed the idea of hybrid detection. This study analyses machine learning techniques in IDSs, reviewing related studies on machine learning, deep learning, and hybrid approaches published in the period from 2000 to 2020.
16:10 IoT based Smart Digital Electric Meter for Home Appliances
Automated meter reading (AMR) systems help to utilize electricity-dependent resources in the home effectively. They find numerous areas of application such as smart homes, radio frequency networks, and touch-based AMR. Electrical energy consumption is proportional to the level of electricity usage: the higher the usage, the higher the billing cost, and uncontrolled daily consumption causes the monthly electricity bill to exceed the household budget. To address this challenge, we propose a low-cost system with an architecture improved over that of existing electric meters as a tool for effectively monitoring the electricity consumption of household equipment. The system lets users monitor their appliances from a distant location via the internet. Advantages of the system include mobility and convenience, since users require only an internet connection to monitor the equipment.
16:30 Improving the Performance of Multinomial Logistic Regression in Vowel Recognition by Determining Best Regression Coefficients
The performance of Multinomial Logistic Regression (MLR) is highly dependent on the estimated values of its parameters (regression coefficients, RCs). However, the usual maximum likelihood estimation (MLE) approach often results in overfitting the regression model, especially with limited data. Hence, alternative shrinkage approaches such as lasso and ridge were proposed, but the shrinkage process can eliminate important predictors by shrinking their RC values to zero. We propose a data splitting and swapping approach aimed at eliminating the identified problems of existing estimation approaches while improving the performance of MLR. Two algorithms were implemented for determining the best set of RCs (DBRCs): DBRCs-I and DBRCs-II. Experimental results show that DBRCs-II outperforms the conventional MLE approach by 2.05% in overall recognition of Malay vowels. Given enough training data, the DBRCs-I swapping technique can be used to obtain good RCs faster.
S4H: Sustainable Decisions for a Sustainable Development 4
15:30 The Intuitionistic Fuzzy Set FlowSort methodology for green supplier Evaluation
Evaluating suppliers with respect to environmental and economic effects is an important classification-type multi-criteria decision-making (MCDM) problem. It consists of sorting green suppliers into ordered, predefined groups and is recognized as an uncertain and ambiguous issue. In view of this, the current work develops a model that integrates the IFS-FlowSort method for solving the green supplier evaluation categorization problem. IFS-FlowSort is an extension of the classification method FlowSort that handles fuzziness by using intuitionistic fuzzy set theory. To validate our model, a case study evaluating green pharmaceutical industries is presented.
15:50 Land Surface Temperature Modelling over Geoclimatic Regions of Nigeria using Soft-computing Intelligence Techniques
This study used minimum (TN), maximum (TX), and mean (TM) land surface temperature data obtained from the Nigerian Meteorological Agency, Oshodi, Nigeria, covering a period of thirty years (1984 – 2013) over twenty meteorological stations in Nigeria. These twenty stations were grouped into the four climatic regions of Nigeria. Two soft-computing intelligence techniques – the multilayer perceptron (MLP) and radial basis function (RBF) neural networks – were employed to predict these three land surface temperature (LST) series. The networks were created with twenty past years of the LST series as inputs to predict the next ten years (2003 – 2013), using 60% of the data for training, 20% for validation, and 20% for testing. The performance of the outputs (the predicted ten-year LST series) was evaluated using standard statistical metrics. Analyses of the coefficient of efficiency (NSE) showed that MLP and RBF performed best for the prediction of TN in the Sahel region with NSE values of 0.987 and 0.967, TX in the Guinea Savannah region with NSE values of 0.973 and 0.966, TM in the Derived Savannah region with NSE values of 0.995 and 0.922, and TX in the Coastal region with NSE values of 1.00 and 0.520, respectively. The error analysis using root mean square errors and the coefficient of uncertainty also showed that both networks have acceptably small values, with the MLP network lower than the RBF network. Therefore, it can be concluded that soft-computing intelligence techniques, especially multilayer perceptron networks, are suitable for the prediction of land surface temperature series over Nigeria for practical purposes.
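The NSE (Nash-Sutcliffe coefficient of efficiency) used above compares the model's squared error against that of simply predicting the observed mean. A minimal sketch with toy temperature values (purely illustrative, not the study's data):

```python
def nse(observed, predicted):
    """Nash-Sutcliffe coefficient of efficiency: 1.0 is a perfect fit,
    0.0 means the model is no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Toy monthly temperature series (degrees C), purely illustrative
obs = [25.0, 27.0, 30.0, 32.0, 31.0, 28.0]
pred = [25.5, 26.5, 29.5, 32.5, 30.5, 28.5]
score = nse(obs, pred)
```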
16:10 Innovative Technologies for the Creation of a New Sustainable, Environmentally Neutral Energy Production in Ukraine
In Ukraine, there are currently contradictions between the growing demand for energy resources, limited reserves of traditional fuels, and their constantly rising prices. At the same time, the state has a significant raw material potential for the development of renewable energy in the form of biomass, wind and solar energy. This paper shows the potential for large-scale energy production from renewable sources, both for domestic use and for export to other countries. Thus, the use of innovative technologies in renewable energy production can contribute to the creation of a new sustainable, environmentally neutral model of Ukraine's economy. The innovative technologies – biomass torrefaction, processing of agricultural waste into biogas, and production of green hydrogen – can contribute to the creation of a new sustainable energy production model in Ukraine. However, to achieve these goals it is important to create a new infrastructure that will reduce costs within the supply chain of renewable raw materials in the long run and expand the possibilities of their different uses.
16:30 Design Key Performance Indicator for Distribution Sustainable Supply Chain Management
Organic farming provides agricultural products that are healthy and environmentally friendly. The organic farmers' supply chain involves suppliers, growers and distributors. To improve performance, it is necessary to design sustainable performance measurements at the distributors. The key performance indicator (KPI) design process consists of KPI identification, KPI elimination and KPI validation. Then, to determine the significant weights, the KPIs were analyzed using the Analytical Hierarchy Process (AHP) method. The research results yield 38 indicators used in this KPI set. Based on AHP weighting, the three aspects obtained weights of 0.504 (economic), 0.2815 (social) and 0.2145 (environmental). In future research, the KPIs obtained can be used as a measurement of distribution performance.
S4I: Decision Models in Marketing 2
15:30 Analysing the effect of electronic service quality and satisfaction on apparel purchase propinquity decision
This study empirically examines the interrelationship among service quality, customer satisfaction and behavioral intention for online apparel purchase. Service quality was measured through website design, reliability, responsiveness, trust and personalisation; revisits to the website and recommendations to others were used to measure behavioral intention. The study also examines the mediating role of satisfaction between service quality and behavioral intention for 371 Indian online customers. The hypothesized relationships were tested through multiple regression, and the mediating effect was analyzed through hierarchical regression. The findings support that e-service quality is an antecedent of customer satisfaction and that satisfaction mediates between service quality and behavioral intention. Perceived service quality directly affects behavioral intention, implying that its impact is as essential as that of satisfaction. The findings will enable online managers to remain more competitive in the industry by helping them understand customer perceptions and expectations of online apparel shopping.
15:50 Adoption of FinTech and Future Perspective: An Empirical Evidence from Bahrain on Digital Wallets
The current study aims to examine the factors affecting the acceptance of digital wallets in the Kingdom of Bahrain. A multi-item scale is designed to measure the different risk and benefit factors. A sample of 392 digital wallet users was obtained from Bahrain through an online survey to empirically validate the constructs using Confirmatory Factor Analysis (CFA). Structural Equation Modeling (SEM) is used to assess the structural model and test the relevant hypotheses examining the continuous intention to use digital wallets in Bahrain.
16:10 Agility decision-making model in digital enterprise markets driven by turbulence of cultural cognition and technological Innovation
Agile methodology has become a very popular and proven approach to success in software project management. The concept has recently started to be viewed by non-software project managers and enterprise decision makers as a mechanism for success in other industries, and marketing agility is no exception. In an innovative industry, products that meet customer satisfaction must be delivered as anticipated by fast-changing customer requirements, making marketing agility inevitable. After identifying and reviewing the need for marketing agility in decision making, this paper proposes a decision-making model based on multiple criteria. These criteria can be split into three main categories: the cultural cognition of analysts within the enterprise, the cultural cognition of potential customers anticipating the new product, and technological competitiveness in a highly turbulent technological environment. The purpose of this work-in-progress study is to identify the main driving elements in such industrial sectors that necessitate an agile marketing project model for effective decision making in a technologically turbulent environment.
16:30 Business Analytics of E-Commerce Policy and Practice: An Ethical Perspective
Rania Aburaya (University of Bahrain, Bahrain)
E-commerce has become an important technological business phenomenon all over the world. This paper aims at investigating e-commerce policy perspectives and business analytics of e-commerce practice statistics. In doing so, it highlights the importance of promoting ethical values in preventing illegal and unauthorized e-commerce practices. The study shows that e-commerce is more than just the purchase and sale of products and services over the internet; it is a much broader transformation in the way of doing business and communicating with stakeholders. A sound e-commerce strategy should manage the integration between business strategy and information technology strategy. Business analytics of worldwide e-commerce practice statistics and trends reveal that global e-commerce is no longer a choice but an inevitable business activity requiring the adoption of online strategies. The study has important implications for both businesses and stakeholders, necessitating new and innovative e-commerce infrastructure.
Plenary 1: Plenary Session
INFORMSBH: INFORMS-BH Group meeting
Monday, November 9
Plenary 2: Plenary Session
S5A: Decision Aid in Logistics and Engineering 6
10:10 Influence of External Workforce Diversity Factors on Labor Productivity in Construction Projects: Empirical Evidence from Pakistan
Ahsen Maqsoom and Khawar Khan (COMSATS University Islamabad Wah Campus, Pakistan); Muhammad Ali Musarat (Universiti Teknologi PETRONAS, Malaysia); Hasnain Mubasit and Iram Shaheen (COMSATS University Islamabad Wah Campus, Pakistan)
Workforce diversity has grown into a pressing issue throughout the world, as it is deeply connected to organizational performance, which must be enhanced and competitive in the current global environment. To enter the international arena, workforce diversity is among the fundamental solutions, and organizations must invest essential reserves in it to create opportunities for optimum employee productivity. Various workforce diversity factors influence employee productivity, thus influencing organizational performance and the built environment. This research analyzes the location, physical ability, ethnicity, and labor union related workforce diversity factors influencing labor productivity. Data was collected through on-site surveys, with a questionnaire completed by 131 management professionals employed at different project sites. Based on average MIR values, location and physical ability were characterized as the most critical workforce diversity factors affecting labor productivity. It was also identified that labor productivity is highly reliant on the location of the workplace, the environment provided, the physical ability of the worker, cultural norms and, particularly, lockdown effects. This research involves participants from the construction industry of Pakistan; future research can be extended to further service sectors and growing economies.
10:30 Artificial Intelligence As Decision Aid In Humanitarian Response
Artificial Intelligence (AI), Remote Sensing (RS), and Big Data have had a beneficial impact on society. AI is able to take over repetitive and dangerous tasks. The usage of AI, RS, and Big Data is quite promising because it enhances capabilities for problem-solving mechanisms. This research investigates the rise of AI, RS, and Big Data in the context of humanitarian aid and insurgency. The paper includes background theories of AI, RS, and Big Data, while case studies of the usage of these technologies are explored and discussed.
10:50 A Branch-and-Bound Algorithm For The Problem of Scheduling With a Conflict Graph
This paper addresses the problem of scheduling n unit-time jobs on m uniform machines. The jobs are subject to conflict constraints modeled by a graph G, called the conflict graph: adjacent (conflicting) jobs in G are not allowed to be processed on the same machine. The objective is the minimization of the makespan of the schedule, a problem known to be strongly NP-hard in the literature. The contribution of the current paper is an unorthodox branch-and-bound algorithm that proved highly efficient in empirical testing.
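The branching scheme for this kind of problem can be sketched as follows. This is a generic depth-first branch-and-bound, simplified to identical machines (the paper treats uniform machines, and its specific bounding rules are not reproduced here):

```python
def min_makespan(n, conflicts, m):
    """Branch-and-bound sketch: assign n unit-time jobs to m identical
    machines so that conflicting jobs never share a machine, minimizing
    the maximum number of jobs on any machine (the makespan).

    `conflicts` is a set of frozensets {i, j} of conflicting job pairs.
    Returns the optimal makespan, or None if no feasible schedule exists.
    """
    best = [n + 1]                      # incumbent upper bound on makespan
    loads = [0] * m                     # jobs currently on each machine
    assign = [[] for _ in range(m)]     # jobs assigned to each machine

    def branch(job):
        if job == n:                    # complete schedule: update incumbent
            best[0] = min(best[0], max(loads))
            return
        for k in range(m):
            # prune: machine k already holds a job conflicting with `job`
            if any(frozenset((job, other)) in conflicts for other in assign[k]):
                continue
            # prune: this branch cannot beat the incumbent
            if loads[k] + 1 >= best[0]:
                continue
            loads[k] += 1
            assign[k].append(job)
            branch(job + 1)
            assign[k].pop()
            loads[k] -= 1

    branch(0)
    return best[0] if best[0] <= n else None

# Path conflict graph 0-1-2-3 on two machines: {0,2} and {1,3} give makespan 2
result = min_makespan(4, {frozenset(p) for p in [(0, 1), (1, 2), (2, 3)]}, 2)
```

Stronger lower bounds (e.g. from cliques in G, which force distinct machines) are what make a real branch-and-bound competitive; the sketch uses only the incumbent for pruning.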
11:10 Attitude Regulation of Spacecraft using Large Angle Eigen-axis Rotations
The regulation of satellite orientation in orbit is an essential task for the success of a space mission. To reorient the attitude of a satellite from an initial point to the desired attitude, an efficient control law needs to be developed. In this paper, an attitude control law is proposed to achieve the shortest-path eigen-axis rotation between two attitudes. Moreover, it is shown that the input energy required for the optimal eigen-axis rotation can also be optimized through the proposed approach. The stability of the proposed methodology is proved using Lyapunov theory, and the scheme is validated through numerical simulation.
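The eigen-axis of the shortest rotation between two attitudes is commonly extracted from the error quaternion. A minimal sketch with quaternion attitudes in (w, x, y, z) convention (standard kinematics, not the paper's control law):

```python
import math

def quat_mul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def eigen_axis(q_current, q_desired):
    """Eigen-axis and angle of the single rotation taking the current
    attitude to the desired one: q_err = q_desired * conj(q_current)."""
    w, x, y, z = q_current
    q_err = quat_mul(q_desired, (w, -x, -y, -z))
    angle = 2.0 * math.acos(max(-1.0, min(1.0, q_err[0])))
    s = math.sin(angle / 2.0)
    axis = (q_err[1] / s, q_err[2] / s, q_err[3] / s) if s > 1e-12 else (0.0, 0.0, 0.0)
    return axis, angle

# 90-degree rotation about the body z-axis, starting from the identity attitude
axis, angle = eigen_axis((1.0, 0.0, 0.0, 0.0),
                         (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4)))
```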
S5B: Medical Decision Making 5
10:10 Technology Applications for Health Safety Decision Making under COVID-19 Pandemic Management
A lot of new technological applications are emerging to combat the deadly novel coronavirus. This research study gives an overview of different applications that are developed by Government Institutions, Private Firms, and Individual Citizens across the world. The applications are reviewed based on their widespread use, effectiveness, availability to the broader audience, cost to implement it, concerns regarding the privacy and information collected by these apps and systems. The major eight areas of technology applications covered in this study are Contact Tracing, Social Distancing & Mask Detection, Live-feeds based Dashboards, Information Searching, Big Data and Robotics, Web-based Disease Surveillance Tools, Patient-level Information, Doctor-Patient interaction, and Informatory Chatbots. More than 100 apps were collected for this research survey to conclude the different categories in which technology is being used for decision making. This study will be useful for various health administrators, professionals, researchers, and academicians.
10:30 New Combining Rules for Spatial Clustering Methods Using Sigma-Count for Spatial Epidemiology
In epidemiology, the study of several diseases is related to the geographical territory in which they occur. Some spatial clustering methods are able to make statistical decisions about the significance of territories. However, each method is based on a different methodology and therefore provides different results. Recently, some authors proposed combining spatial clustering methods in order to provide more accurate results. This paper proposes two new combining rules based on the cardinality (sigma-count) of fuzzy sets, generalizing classical Majority Voting and Plurality Voting. A case study using real dengue fever epidemiological data and combining five spatial clustering methods was performed. In this study the Fuzzy Plurality Voting provided a better decision map than the classical rules, when compared to a reference map.
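A minimal sketch of sigma-count-based fuzzy majority voting: the sigma-count of a fuzzy set is the sum of its membership degrees, so a region is flagged when the combined degree across methods exceeds half the number of methods. The membership values below are hypothetical, not the paper's rules or data:

```python
def sigma_count_majority(memberships, threshold=0.5):
    """Combine several clustering methods' fuzzy 'significant' degrees.

    memberships[m][r] is method m's degree (0..1) that region r belongs
    to a significant cluster.  A region is flagged when the sigma-count
    (sum of degrees) exceeds `threshold` times the number of methods.
    """
    n_methods = len(memberships)
    n_regions = len(memberships[0])
    flagged = []
    for r in range(n_regions):
        sigma = sum(memberships[m][r] for m in range(n_methods))
        flagged.append(sigma > threshold * n_methods)
    return flagged

# Five hypothetical methods scoring three regions
scores = [
    [0.9, 0.2, 0.6],
    [0.8, 0.1, 0.4],
    [0.7, 0.3, 0.5],
    [0.6, 0.2, 0.7],
    [0.9, 0.4, 0.6],
]
decision = sigma_count_majority(scores)
```

With crisp (0/1) memberships this reduces to classical majority voting, which is the sense in which the fuzzy rule generalizes it.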
10:50 Cancer Literature Classification Methods Performance
A literature classification system is the best solution to improve the data search process; its goal is to compare relevant biomedical papers and discover novel knowledge to identify potential research issues. This paper presents cancer literature classification performance by comparing three approaches: Naïve Bayes, Neural Network and a Linear Classifier with SGD training. The proposed approaches classify biomedical literature into five classes of cancer literature, namely bone cancer, gastric cancer, kidney cancer, skin cancer and papillary thyroid cancer, using 9259 documents. The general steps for building the classifier follow those for classifying scientific literature. The results show that all three algorithms can successfully be used to classify cancer literature; however, for the best performance, it is strongly recommended to use Naïve Bayes or Neural Network.
11:10 Incorporating the decision maker’s preferences in Dietary Menu Planning problem
A Dynamic Goal Programming Diet Menu Plan (DGPDMP) model for patients undergoing hemodialysis has previously been formulated and validated, and the obtained menus guarantee all the patient's nutritional requirements. However, it is recommended that the decision maker (DM) (the patient, in accordance with the dietician's guidelines) be able to work with a model consisting of multiple, conflicting goals pursued simultaneously, even when the objectives have different importance, and to communicate the objectives' relative importance by determining a level of satisfaction according to the deviations associated with the nutritional goals. The development of the Weighted DGPDMP model (WDGPDMP) is presented in this work, and we explicitly introduce the structure of the DM's preferences into the developed model.
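The core idea of weighted goal programming, minimizing a weighted sum of deviations from nutritional goals, can be shown with a tiny brute-force sketch. The foods, goals and weights below are hypothetical and the enumeration stands in for the LP solver a real model would use:

```python
import itertools

def weighted_gp_menu(foods, goals, weights, max_servings=4):
    """Pick servings minimizing the weighted sum of absolute deviations
    from the nutritional goals (a brute-force weighted goal program).

    foods: {name: {nutrient: amount per serving}}
    goals / weights: {nutrient: target} / {nutrient: importance}
    """
    names = list(foods)
    best_combo, best_dev = None, float("inf")
    for combo in itertools.product(range(max_servings + 1), repeat=len(names)):
        dev = 0.0
        for nut, target in goals.items():
            total = sum(c * foods[n][nut] for c, n in zip(combo, names))
            dev += weights[nut] * abs(total - target)
        if dev < best_dev:
            best_combo, best_dev = dict(zip(names, combo)), dev
    return best_combo, best_dev

# Hypothetical two-item menu with two nutritional goals
foods = {"rice": {"kcal": 200, "protein": 4},
         "fish": {"kcal": 150, "protein": 20}}
menu, deviation = weighted_gp_menu(foods, {"kcal": 700, "protein": 48},
                                   {"kcal": 1.0, "protein": 5.0})
```

Raising a nutrient's weight expresses the DM's preference that deviations from that goal matter more, which is precisely the preference structure the WDGPDMP model introduces.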
S5C: Computerized Decision Aid 5
10:10 Comparison between Convolutional Neural Network and Convolutional Neural Network-Support Vector Machines as the classifier for Colon Cancer
Colon cancer begins in the rectum and grows in the last part of the large intestine. In the early stages there are no symptoms, and the disease can be identified by machine learning methods. The Convolutional Neural Network is a popular machine learning method used in a wide range of application domains and known for its high accuracy. In addition, the Support Vector Machine method with several kernel functions has been applied to classification. Therefore, this research compares the performance and accuracy of the Convolutional Neural Network and the Convolutional Neural Network-Support Vector Machine as classifiers for colon cancer.
10:30 Hyperparameter Optimization on Support Vector Machine using Grid Search for Classifying Thalassemia Data
Thalassemia is a genetically inherited disorder that causes reduced hemoglobin and fewer red blood cells in the body. Although thalassemia is not contagious, it cannot be cured, and some types require lifetime blood transfusions. However, the occurrence of thalassemia can be prevented by increasing public knowledge and awareness, followed by early detection through a screening process. The purpose of this screening is to identify carriers that do not appear to have symptoms through a number of examination procedures in the target population. The data used in this study were obtained from the Harapan Kita Children and Women's Hospital, Jakarta, Indonesia, and consisted of 150 samples and 11 features. A Support Vector Machine (SVM) with hyperparameter optimization using Grid Search is proposed in this study to classify thalassemia data. Grid Search was used to optimize the C and gamma parameters of an SVM with an RBF kernel. The results showed that the proposed method produced 100% accuracy with 90% training data when C = 428.13 and gamma = 0.0000183 using holdout validation. Furthermore, it produced 100% accuracy when C = 4832.93 and gamma = 0.0000183 using 10-fold cross validation. These results are considerably better than using the same RBF kernel with the default parameter values (C = 1 and gamma = scale), which produced only 73.33% and 57.14% accuracy with holdout and 10-fold cross validation, respectively.
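The grid-search idea reduces to an exhaustive sweep over (C, gamma) pairs scored on validation accuracy. A stdlib-only sketch, with a toy scoring function standing in for actual SVM training (the real study trains and scores an RBF-kernel SVM at each grid point):

```python
import itertools

def grid_search(evaluate, c_grid, gamma_grid):
    """Exhaustive grid search over (C, gamma): return the pair with
    the highest validation score, plus that score."""
    best = max(itertools.product(c_grid, gamma_grid),
               key=lambda cg: evaluate(*cg))
    return best, evaluate(*best)

# Toy stand-in for SVM holdout accuracy, peaking near (400, 2e-5).
def toy_accuracy(C, gamma):
    return 1.0 - abs(C - 400) / 1e4 - abs(gamma - 2e-5) * 100

(best_C, best_gamma), acc = grid_search(
    toy_accuracy,
    c_grid=[1, 10, 100, 400, 1000],
    gamma_grid=[1e-5, 2e-5, 1e-4, 1e-3])
print(best_C, best_gamma)  # -> 400 2e-05
```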
10:50 TextMage: The Automated Bangla Caption Generator Based On Deep Learning
Neural networks and deep learning have seen an upsurge of research in the past decade due to their improved results. Generating text from a given image is a crucial task that requires combining two fields, computer vision and natural language processing, in order to understand an image and describe it in a natural language. However, existing works have all been done on a particular lingual domain and on the same set of data, which leads to systems that perform poorly on images belonging to other locales' geographical contexts. TextMage is a system capable of understanding visual scenes that belong to the Bangladeshi geographical context and of using that knowledge to describe them in Bengali. Hence, we have trained a model on our previously developed and published dataset named BanglaLekhaImageCaptions. This dataset contains 9,154 images along with two annotations for each image. In order to assess performance, the proposed model has been implemented and evaluated.
11:10 Linear Support Vector Machine and Logistic Regression for Cerebral Infarction Classification
Stroke, one of the Global Burden Diseases (GBD), obstructs the flow of blood to the brain and causes neurologic devastation. It comprises two types, hemorrhagic and ischemic, with approximately 87% of all strokes classified as ischemic due to cerebral infarction, the occlusion of a cerebral vessel. Therefore, early identification is needed to enable patients to obtain the right treatment and prevent chronic cerebral infarction. This research proposes the use of machine learning algorithms for appropriate and early diagnosis of patients with cerebral infarction by comparing the linear-kernel Support Vector Machine (SVM) and logistic regression methods. The main advantage of this approach is its ability to determine the better linear classifier of the two for cerebral infarction classification on four criteria, namely accuracy, precision, recall, and F1 score. The highest average accuracy and F1 score were used to determine the best classifier. The results showed that the linear-kernel support vector machine is the better classifier for cerebral infarction, with 90.96% average accuracy and 91.44% average F1 score. In conclusion, future studies need to be carried out to improve machine learning classification for medical diagnosis using a linear classifier.
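The four comparison criteria can be computed directly from confusion counts. A small sketch with made-up labels (illustrative only; the study's data and classifiers are not reproduced):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall and F1 from label lists --
    the four criteria used to compare binary classifiers."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

y_true = [1, 1, 1, 0, 0, 1, 0, 1]   # 1 = infarction, 0 = healthy (toy labels)
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, prec, rec, f1)  # -> 0.75 0.8 0.8 ~0.8
```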
S5D: Energy Management Decisions 3
10:10 Electricity Power Demand Comparative Analysis of District Cooling and Conventional Cooling Systems – Case Study in Diyar Al Muharraq (DAM) Development Project
Bahrain, one of the Arabian Gulf countries, has a warm to harshly hot climate throughout most of the year with a short, moderate winter. Cooling demand consumes around 60% of the electricity during peak summer, and with ongoing global warming the past few years have consistently registered new heat records. Various new cities are being reclaimed from the sea to cope with local population demands, for which new infrastructure, including electricity, is needed. Bahrain already has a total electricity capacity of 4,025 MW, which is very large compared to the country's size. The government will face the requirement to add or connect consistently high electricity loads to new developments if conventional air cooling (AC) continues, including in this case study of Diyar Al Muharraq (DAM), one of the largest on-going reclaimed development projects, stretching over 13 km². However, previous studies have shown that 30% or more of electricity consumption can be saved by using District Cooling (DC) systems, depending on multiple parameters, including the targeted customers' location, types and demand, the cooling demand pattern, and the efficiency of the conventional AC systems implemented in the area concerned. The present work investigates the potential use of DC in the DAM project to reduce electricity demand, one of the strategic challenges at local and regional levels, considering that generation is mostly from petroleum sources, primarily by burning natural gas, which has a negative environmental impact. The conclusion is that, under current local conditions and the above-stated parameters, DC provides around 31-36% electricity reduction.
10:30 A Proposed Lean Distribution System for Solar Power Plants Using Mathematical Modeling and Simulation Technique
Mohsen Momenitabar and Zhila Dehdari Ebrahimi (North Dakota State University, USA); Seyed Hassan Hosseini (Sapienza University of Rome, Italy); Mohammad Arani (University of Arkansas at Little Rock, USA)
Today, power waste is one of the most crucial problems that power stations across the world are facing. In this paper, we apply two approaches to the development of solar power plants. First, we propose a mathematical model to reduce the cost of production, and then a simulation technique is applied to the electricity transmission and distribution system. Furthermore, we consider two comparison criteria: the overall cost of the system and the rate of energy waste during transmission. The primary approach is to use both models in order to draw a comparison between the simulation results and the mathematical model. Finally, the analysis of the test results, obtained with the CPLEX toolbox of MATLAB 2019, shows a remarkable decrease in the costs of energy demand in the electricity transmission and distribution network.
10:50 Effects of Chevrons on the Acoustic Noise and Velocity Patterns of Aircraft Nozzles
Aircraft engines are a major contributor to noise levels these days. One way of decreasing these levels is by using differently shaped nozzles. The main aim of this research is to investigate the noise levels around an aircraft nozzle, including the chevron nozzle. Another aim is to determine whether the changes in noise levels are linked to the velocities produced by the nozzles. The simulation involves analyzing two nozzles, one with V-shaped chevrons and the other without chevrons. ANSYS Fluent CFD is used to simulate the conditions around a nozzle. The results obtained show that the V-shaped chevron nozzle produced lower noise and a higher average velocity. Keeping noise levels low not only benefits the engine but also keeps noise pollution to a minimum.
11:10 A Fuzzy Goal Programming Model for Selecting the Best Sustainable Renewable Energy Source in Algeria
Mohammed Seghir Guellil (POLDEVA Laboratory, Algeria); Samir Ghouali (STIC Lab, Algeria); Mostéfa Belmokaddem (Faculty of Economics, Business and Management Sciences, Algeria); Samir Bettahar (Faculty of Economics and Management, Tlemcen, Algeria); Fayçal Mokhtari (MCLDL Laboratory, Faculty of Economics, Business and Management Sciences, Algeria)
Choosing the best Renewable Energy (RE) source is an uncertain Multi-Criteria Decision-Making (MCDM) issue. In particular, it involves the search for the best RE source that fulfills the DM's preferences under contradictory parameters, such as technological, environmental, social and economic factors. This paper uses an effective approach called Fuzzy Goal Programming (FGP) to solve complex DM problems with high uncertainty, and helps address a wide variety of uncertainties in real-world DM issues based on the traditional FGP model. The approach is validated on a real-life scenario: selecting the best RE source for generating energy in Algeria. Seven parameters are used to evaluate five sources: solar, wind, geothermal, biomass and hydroelectricity. The results show that the approach enables the DM to find the best sustainable RE source in Algeria in unpredictable and imprecise situations.
S5E: Decision Making Using IoT and ML 6
10:10 Automated Home Based Physiotherapy
Singarao Karthik (Sreyas Institute of Engineering and Technology, India); MVV Prasad Kantipudi (Sreyas Institute of Engineering and Technology, India); John Moses C (Sreyas Institute of Engineering and Technology, India); Sandeep Kumar (Sreyas Institute of Engineering and Technology, India); Kuchuru Mysura Reddy and Dileep Seshu B (Sreyas Institute of Engineering and Technology, India)
Over the last few decades there has been rapid technological development in the medical field, and automation now plays an important role in our day-to-day lives. Because of time constraints, no one wants to waste time waiting unnecessarily at hospitals, so the development of home-based physiotherapy will be very useful for patients dealing with minor problems. Automated home-based physiotherapy makes the patient more comfortable, as he or she can do it at home without any extra medical fee. The role of physiotherapy is much appreciated in the medical field, as it needs no medication and shows fewer side effects. Physiotherapy includes several methods, among the most common being manual therapy (massaging), stretches and exercises, magnetic therapy, and joint mobilization. Physiotherapy helps those who have met with an accident and have fractures by reducing pain and making the body stronger and more flexible. This work examines the feasibility of implementing home-based physiotherapy using Inertial Measurement Unit (IMU) regularization and wireless sensor networks.
10:30 Pandemic Stabilizer
Ritu Dhull (Sreyas Institute of Engineering and Technology, India); Dheeraj Chava, Vineeth Kumar Deepala, MVV Prasad Kantipudi, Gaurav Samudrala and Vijay Bhargav M (Sreyas Institute of Engineering and Technology, India)
An outbreak of Covid-19, a new disease caused by a coronavirus, was reported in December 2019 in China, and the disease subsequently spread throughout the world. In serious cases it causes death due to failure of the human respiratory system. The spread of the coronavirus is currently hard to control, especially in crowded places and where the protocols given by the World Health Organization (WHO) to reduce the spread of Covid-19 are ignored. Many researchers are involved in identifying vaccines to give proper treatment to affected people and to avoid further spreading. In addition, other researchers are using modern technologies to prevent the spread of Covid-19, to identify the disease at a very early stage, and to reduce the death rate. One of the modern systems recommended by researchers is the smartwatch. Using smartwatches, the user's body temperature, heart rate and blood pressure can be measured, and the smartwatch can be charged automatically by body heat. Such smartwatches can be controlled through the Google Assistant chatbot. This paper discusses the design, principle of operation and features of different smartwatches and their usage. Based on the analysis, it is identified that recent cost-effective smartwatch developments are available for reducing the spread of Covid-19.
10:50 Machine Learning for Malware Detection on Balanced and Imbalanced Datasets
There is tremendous growth of malware with each passing day. It has become difficult to cope with such an increasing number of malware samples, especially new and unseen malware, which poses a serious threat to software and the internet. Malware and machine learning are a natural pairing: malware contains similar patterns due to the reuse of code, while machine learning can detect those similarities. In this paper, two experiments are performed, for balanced and imbalanced data, on a previously built malware detection dataset of API calls, using various machine learning classifiers: k-Nearest Neighbors, Gaussian Naive Bayes, Multinomial Naive Bayes, Decision Tree, and Random Forest. In both experiments, Random Forest provides the best results, with an accuracy of 90.38% on the balanced dataset and 98.94% on the imbalanced dataset.
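One common way to build the balanced variant of such a dataset is random undersampling of the majority class. A stdlib sketch with synthetic data (the class sizes and labels are ours; the paper does not state how its balanced set was produced):

```python
import random
from collections import Counter

def undersample(samples, seed=0):
    """Random undersampling: trim every class to the minority-class
    size so accuracy is not dominated by the majority class.
    samples: list of (features, label) pairs."""
    rng = random.Random(seed)
    by_label = {}
    for x, y in samples:
        by_label.setdefault(y, []).append((x, y))
    n_min = min(len(group) for group in by_label.values())
    balanced = []
    for group in by_label.values():
        balanced.extend(rng.sample(group, n_min))
    rng.shuffle(balanced)
    return balanced

# 90 malware vs 10 benign samples: heavily imbalanced toy data.
data = [([i], "malware") for i in range(90)] + [([i], "benign") for i in range(10)]
balanced = undersample(data)
print(Counter(y for _, y in balanced))  # -> 10 of each class
```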
11:10 Real-time Contact Tracing During a Pandemic using Multi-camera Video Object Tracking
Due to the COVID-19 pandemic, contact tracing and moving object tracking are gaining popularity in automated video surveillance systems in computer vision and video processing. Contact tracing and moving object tracking are critical in applying pandemic control measures and are becoming more important day by day. This work proposes a computer vision-based algorithm for contact tracing using stationary surveillance cameras. The input videos are converted into a bird's-eye view in which all moving objects are detected and the distances between them are calculated. The algorithm performs background subtraction to isolate foreground objects, morphological operations to remove noise, and blob analysis to identify the connected regions in the resulting foreground video. Kalman filters estimate the objects' motion in the video, and the Euclidean distance between the objects is calculated to trace contacts. This algorithm can be utilized in almost all public places, such as shopping malls, airport terminals, and educational institutions, and allows identifying, assessing, and managing people who might have been exposed to the disease. The testing data was collected in a home environment, with the stationary camera replaced by a mobile phone camera fixed on a tripod. The work was implemented and tested, and the results verified the feasibility and effectiveness of the proposed method. The system was able to detect the objects in the input video frames and estimate the distance between them across multiple cameras.
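The final contact test is a pairwise Euclidean-distance check in the bird's-eye-view plane. A minimal sketch over one frame of tracked centroids (object IDs, coordinates, and the threshold are illustrative):

```python
import math

def contacts(positions, threshold):
    """Flag pairs of tracked objects whose centroids are closer than
    `threshold` in the bird's-eye-view plane.
    positions: dict object_id -> (x, y)."""
    ids = sorted(positions)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            (xa, ya), (xb, yb) = positions[a], positions[b]
            if math.hypot(xa - xb, ya - yb) < threshold:
                pairs.append((a, b))
    return pairs

# One frame: three tracked people in ground-plane metres.
frame = {"p1": (0.0, 0.0), "p2": (1.0, 1.0), "p3": (5.0, 5.0)}
print(contacts(frame, threshold=2.0))  # -> [('p1', 'p2')]
```

In a full pipeline the positions would come from the Kalman-filtered tracks rather than being given directly.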
S5F: Management Decision 5
10:10 Municipal Infrastructure Prioritization based on Consequence-Based Decision-Making Framework
The failure of municipal infrastructure may cause serious consequences for society, the environment, health, and the economy. In this study, a systematic consequence-based decision-making framework is developed based on experts' judgment and opinion. The causal relationships between different factors are established based on available data, previous literature, and expert judgment. To develop the consequence-based decision-making framework, the Decision Making Trial and Evaluation Laboratory (DEMATEL) method is integrated with Interpretive Structural Modeling (ISM). The results of this study indicate that complex causal relationships exist between the consequence factors. The developed framework can be used to prioritize municipal infrastructure for maintenance, repair, rehabilitation, and replacement, and to represent environmental, social, safety, and economic impacts.
10:30 Innovation in an Emerging Market: A Bibliometric and Latent Dirichlet Allocation Based Topic Modeling Study
Innovation has been recognized as an important factor influencing organizational performance. Malaysia, as an emerging market, has been pushing for innovation to be a stimulant for growth. Due to the huge number of research articles published on innovation in Malaysia, there is a need to understand the existing state of research on the topic. This study examines 1824 papers published from 1973 to 2019. Data were extracted from SCOPUS and analyzed using descriptive figures and tables. Additionally, this study presents the key features of topic modeling based on Latent Dirichlet Allocation (LDA) by extracting the coherent research topics that are the focus of the papers analyzed; through this approach, ten coherent research topics were extracted from the abstracts of the 1824 papers. The study demonstrates themes in research on innovation in an emerging market and contributes by summarizing the common keywords used in the titles and abstracts of articles published until 2019.
10:50 Performance Modeling of Sawmills using Artificial Intelligence
The measurement capabilities of Data Envelopment Analysis (DEA) models are used to train Artificial Neural Network (ANN) models for performance modeling of sawmills in Ontario. The trained ANN models demonstrate promising results in predicting the relative efficiency scores and the optimal combination of inputs and outputs for three categories (large, medium and small) of sawmills in Ontario. The average absolute error in predicting the relative efficiency scores varies from 0.01 to 0.04, and the predicted optimal combinations of the inputs (roundwood and employees) and the output (lumber) show less than 10% prediction error for a large percentage of the sawmills. This study proposes an innovative ANN modeling approach that supports continuous improvement of the forest industry working under an uncertain business environment.
11:10 Factors Influencing Ethical Decision Making: A view through Engineering Consultancy Firm in Malaysia
Nowadays, many unethical issues surround the construction industry, such as professional negligence, fraud, conflicts of interest and bribery, to name a few. Engineers, as a professional group, are highly exposed to this environment and often face ethical dilemmas when making decisions. Decision making by individual construction practitioners in an organisational context, for instance the managers of engineering consultancy firms, has ethical implications and consequences for society. The purpose of this study is to investigate the factors that affect ethical decision making in the construction industry, particularly in engineering consultancy firms. The results provide practitioners with a framework for understanding the key drivers of ethical decision-making among professional engineers employed in engineering consultancy firms.
S5G: Decision Making Using IoT and ML 7
10:10 Similarity Features For The Evaluation Of Obfuscation Effectiveness
Obfuscation is a technique to protect programs from analysis and reverse engineering, and evaluating the effectiveness of such protection is an open problem. In this work, the use of machine learning to solve this problem is considered. Consequently, the task of finding suitable program similarity features must be solved. A scheme based on symbolic execution for finding such features is built. The choice of symbolic execution for constructing similarity features is justified by the fact that such features can characterize the complexity of understanding the program under dynamic analysis. Experiments with the proposed scheme were performed, and an analysis of the results is presented.
10:30 Traceability system using IoT and forecasting model for food supply chain
Ganjar Alfian, Muhammad Syafrudin, Norma Latif Fitriyani and Jongtae Rhee (Dongguk University, Korea (South)); Muhammad Rifqi Maarif (Jenderal Achmad Yani University Yogyakarta, Indonesia); Imam Riadi (Universitas Ahmad Dahlan, Indonesia)
Nowadays, customers' health awareness is of extreme significance. Food can become contaminated at any point during production, preparation and distribution; it is therefore of key importance for the perishable food supply chain to monitor food quality and safety. A traceability system offers complete food information and therefore guarantees food quality and safety. The current study proposes an IoT-based traceability system that utilizes RFID and Raspberry Pi-based sensors. The RFID reader is used to track and trace the product, while the Raspberry Pi measures temperature and humidity during storage and transportation. In addition, a machine learning-based forecasting model is utilized to predict future temperatures, so that an early warning can be raised by the system if the predicted temperature exceeds the normal range. The results show that, compared to traditional methods, the proposed system is capable of tracking products as well as predicting sensor data accurately and effectively.
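The forecast-then-warn step can be sketched with a simple moving-average predictor standing in for the paper's machine-learning model (the readings, window, and safe range below are made up):

```python
def forecast_and_warn(temps, window, low, high):
    """One-step temperature forecast by simple moving average, with an
    early warning when the forecast leaves the safe storage range.
    A minimal stand-in for a learned forecasting model."""
    forecast = sum(temps[-window:]) / window
    alarm = not (low <= forecast <= high)
    return forecast, alarm

readings = [3.8, 4.0, 4.4, 5.1, 6.0, 7.2]   # deg C from a storage sensor
forecast, alarm = forecast_and_warn(readings, window=3, low=2.0, high=6.0)
print(round(forecast, 2), alarm)  # -> 6.1 True (raise early warning)
```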
10:50 A Processing Delay Tolerant Workflow Management in Cloud-Fog Computing Environment (DTWM_CfS)
K J Naik (National Institute of Technology Raipur, India)
Distributing IoT workflow tasks among fog and cloud nodes has become an important issue nowadays. To address this issue, a heuristic-based delay-tolerant task scheduling and workflow management system for cloud-fog computing (DTWM_CfS) is proposed in this work. The proposed system utilizes the underlying resources efficiently and reduces the average makespan time and total cost by increasing the CMT ratio.
11:10 Decision Making Based On Machine Learning Algorithm For Identifying Failure Rates In The Oil Transportation Pipeline
The Oil Industry Safety Directorate (OISD) has reported that nearly 33% of pipeline defects are due to improper pigging and improper forecasting of the existence of cracks in long-run pipelines. Therefore, pipeline engineers need effective and proficient intelligent approaches to identify and pinpoint these pipeline imperfections. To address these issues, an unsupervised machine learning technique with a partition clustering algorithm is implemented to detect, at a premature stage, the occurrence of cracks or sedimentation inside pipelines during the long-run passage of oil. Partition clustering fits this task well because it organizes clusters as spherical shapes, so that similarity within a cluster is high and similarity between clusters is low. In the proposed work, the well-suited partitioning clustering approach is combined with K-means clustering to predict the occurrence of anomalies in an oil pipeline system.
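A from-scratch one-dimensional k-means sketch of the partition-clustering idea (the sensor readings and interpretation below are invented for illustration; the paper's features are not specified here):

```python
def kmeans_1d(values, k, iters=20):
    """Plain partition clustering (k-means) on a single sensor feature.
    Points grouped around a center far from the nominal one can be
    flagged as potential crack/sedimentation anomalies."""
    # Spread the initial centers across the sorted values.
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return sorted(centers)

# Hypothetical wall-thickness readings (mm) along a pipeline section.
readings = [9.8, 10.1, 10.0, 9.9, 10.2, 6.1, 6.3, 5.9]
print(kmeans_1d(readings, k=2))  # two centers: ~6.1 (anomalous) and ~10.0
```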
S5H: Sustainable Decisions for a Sustainable Development 5
10:10 Machine Learning Techniques for Determining Students’ Academic Performance: A Sustainable Development Case for Engineering Education
Sujan Poudyal and Morteza Nagahi (Mississippi State University, USA); Mohammad Nagahisarchoghaei (University of North Carolina at Charlotte, USA); Ghodsieh Ghanbari (Mississippi State University, USA)
This research paper presents the application of machine learning analysis techniques to education. For large datasets, data mining techniques are used to extract hidden information and create insight. We hypothesized that a prediction algorithm combined with a dimensionality reduction algorithm could be used on an educational dataset to extract hidden information and analyze it to create insight, and that machine learning algorithms can be used to predict student academic performance. Since some of the features in our dataset are correlated, before applying the prediction algorithms we applied dimensionality reduction to reduce the dimension of the dataset and extract the important features. For the prediction analysis, we used three supervised machine learning algorithms, namely K-Nearest Neighbors (KNN), Decision Tree, and Logistic Regression. For feature extraction, we applied two dimensionality reduction algorithms, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), and we compared the performance of the machine learning algorithms. The students' final examination results were taken as the target value, predicted by the above-mentioned supervised algorithms. Our work shows that dimensionality reduction followed by prediction achieved acceptable accuracy for determining student academic performance. Our results also highlight the advantage of employing machine learning techniques on educational data and explain how this provides insight for the sustainable development of engineering education as a whole.
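A minimal from-scratch sketch of one of the three predictors, k-Nearest Neighbors (the student features, labels, and query below are hypothetical, not from the study's dataset):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-Nearest Neighbors: return the majority label among the k
    training points closest (Euclidean distance) to the query.
    train: list of (feature_tuple, label)."""
    nearest = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical (exam_score, attendance_rate) pairs with pass/fail outcomes.
students = [((85, 0.90), "pass"), ((90, 0.95), "pass"), ((88, 0.80), "pass"),
            ((40, 0.50), "fail"), ((35, 0.40), "fail"), ((45, 0.60), "fail")]
print(knn_predict(students, (80, 0.85)))  # -> "pass"
```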
10:30 Effect of Individual Differences in Predicting Engineering Students’ Performance: A Case of Education for Sustainable Development
Morteza Nagahi and Raed Jaradat (Mississippi State University, USA); Mohammad Nagahisarchoghaei (University of North Carolina at Charlotte, USA); Ghodsieh Ghanbari and Sujan Poudyal (Mississippi State University, USA); Simon Goerger (Engineer Research and Development Center & US Army, USA)
The academic performance of engineering students continues to receive attention in the literature. Despite that, there is a lack of studies investigating the simultaneous relationship between students' systems thinking (ST) skills, Five-Factor Model (FFM) personality traits, proactive personality, academic, demographic and family background factors, and their potential impact on academic performance. Three established instruments, namely the ST skills instrument with seven dimensions, the FFM traits with five dimensions, and the proactive personality scale with one dimension, along with a demographic survey, were administered for data collection. A cross-sectional web-based study using Qualtrics was developed to gather data from engineering students. To demonstrate the prediction power of ST skills, FFM traits, proactive personality, academic, demographic and family background factors on the academic performance of engineering students, two unsupervised learning algorithms were applied. The results show that these unsupervised algorithms succeeded in clustering engineering students' performance with respect to primary skills and characteristics. In other words, the variables used in this study are able to predict the academic performance of engineering students. This study also provides significant implications and contributions to the bodies of knowledge on engineering education and education for sustainable development. First, the study presents a better perception of engineering students' academic performance, aiming to assist educators, teachers, mentors, college authorities, and other involved parties in discovering students' individual differences for a more efficient education and guidance environment.
Second, a closer examination of the level of systems thinking and its connection with FFM traits, proactive personality, and academic and demographic characteristics helps in better understanding engineering students' skillsets in the domain of sustainable education.
10:50 Comparative analysis on carbon footprint of hydrogen fuel cell and battery electric vehicles based on the GREET model
Eugene Wong and Danny Ho (The Hang Seng University of Hong Kong, Hong Kong); Stuart So (IEEE Hong Kong Section & Technology Management Chapter, Hong Kong); Alex Tsang (Technology and Higher Institute of Hong Kong (THEi), Hong Kong); Eve Chan (Technological and Higher Education Institute of Hong Kong (THEi), Hong Kong)
Facing global warming and new policies banning the use of diesel in vehicles, there is a growing need to develop vehicles using renewable energy to mitigate carbon emissions in the transport and logistics sector. Among the different forms of non-fossil energy for vehicles, the hydrogen-powered fuel cell emerges as a promising way to combat global warming. However, previous studies on vehicle carbon emissions focus mainly on diesel and electric vehicles (EVs). Moreover, as most product emission assessment methodologies were developed for fast-moving consumer goods, there is insufficient research on the product carbon footprint (PCF) of hydrogen-powered vehicles. To address that gap, this study evaluates the life-cycle assessment (LCA) process of a hydrogen fuel cell vehicle and compares the PCF of an EV (Tesla Model 3) and a hydrogen fuel cell car (Toyota MIRAI) based on the GREET (Greenhouse gases, Regulated Emissions, and Energy use in Transportation) life-cycle assessment model. The study found that the fuel cycle of GREET1 contributes significantly to the vehicle's PCF. Besides, higher transparency in the disclosure of relevant data in the PCF methodologies adopted by vehicle manufacturers is needed to make comparisons of their vehicles' emissions possible. Future research needs to examine best practices for PCF calculation and reporting for new energy vehicles, including private cars and trucks. Future development in simulating various types of hydrogen fuel cells for vehicles and their corresponding emissions is suggested.
11:10 Experimental characterization of dynamic behavior of lab scale vibrating sieve using identified models
Mateus Sousa Freitas (Mosaic Fertilizantes, Brazil); Anderson Lima de Menezes and Vinicius Pimenta Barbosa (Federal University of Uberlandia, Brazil); Estevão dos Santos Gedraite (Siemens, Brazil); Carlos Henrique Ataíde and Rubens Gedraite (Federal University of Uberlandia, Brazil)
Many industrial operations involve the separation of suspensions of particulate systems, with the vibrating screen being one of the most versatile pieces of equipment available for this task. Despite its large-scale use in several areas, there are few experimental studies related to the dynamics of this equipment's operation. The purpose of this study was to characterize the dynamic behavior of the sieve by evaluating the moisture of the retained solid. Using an experimental unit and a simulant fluid formed by sand, water and xanthan gum, semi-empirical identified models for the transient responses were obtained. The dynamic models were obtained by analyzing the sieve response to individual step changes in the independent variables, g-force and fed solids concentration, considering two values of feed flow and first-order-plus-dead-time transfer functions. The moisture content in the retained stream was evaluated after the stabilization of the system response at the new steady state. The best operating point of the sieve, giving the lowest moisture content, was identified as 3.0 g-force, 5.0% concentration and 13 kg/min flow rate. From the experiments performed, it was verified that a decrease in the solids concentration of the feed stream promotes an increase in the moisture of the retained solid. On the other hand, the feed flow had little influence on the response of the equipment. The semi-empirical identified models adequately represented the system dynamics.
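The identified model form, first-order-plus-dead-time (FOPDT), has a closed-form step response. A short sketch with illustrative parameter values (the gain, time constant, and dead time below are not the paper's identified values):

```python
import math

def fopdt_step(K, tau, theta, t, step=1.0):
    """First-order-plus-dead-time response to a step input:
    y(t) = K * step * (1 - exp(-(t - theta)/tau)) for t >= theta,
    and y(t) = 0 during the dead time t < theta."""
    if t < theta:
        return 0.0
    return K * step * (1.0 - math.exp(-(t - theta) / tau))

# Illustrative parameters: gain, time constant (s), dead time (s).
K, tau, theta = 2.0, 8.0, 3.0
for t in (0, 3, 11, 40):
    print(t, round(fopdt_step(K, tau, theta, t), 3))
```

At t = theta + tau the response has covered about 63.2% of its final value, the usual graphical check when fitting FOPDT models to step-test data.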
S6A: Decision Aid in Logistics and Engineering 7
11:40 A Practical Scheduling Optimizer for Plastic Injection Molding Facilities
Scheduling of plastic injection molding facilities has long been an important issue in the operations research and optimization field. This paper presents the development and experimental evaluation of a new optimization approach for production planning and scheduling. It focuses on parallel machine scheduling problems in a plastic injection molding facility in Egypt. A genetic algorithm, a cuckoo search, and a mixed-integer linear programming (MILP) approach were adapted to solve the problem. Multiple experiments were conducted to compare these approaches. An improved MILP solution approach was proposed, and it outperformed the other optimizers in terms of solution accuracy and computational time. Building on this work, a novel scheduling tool based on MATLAB and MS-Excel was developed to help practitioners generate optimum schedules in less computational time. The authors addressed the scheduling optimization problem from both theoretical and practical perspectives, and the outcomes of this research are beneficial for both research and industry.
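As context for the parallel machine scheduling problem above, a minimal sketch of a classical baseline, the longest-processing-time (LPT) heuristic for makespan on identical machines, is given below; it is not the paper's GA, cuckoo search, or MILP approach, and the job durations are made up:

```python
import heapq

def lpt_schedule(jobs, m):
    """Longest Processing Time first: assign each job (in decreasing
    duration order) to the currently least-loaded machine.
    Returns (makespan, assignment) where assignment[i] lists the
    job durations placed on machine i."""
    loads = [(0, i) for i in range(m)]        # (current load, machine id)
    heapq.heapify(loads)
    assignment = [[] for _ in range(m)]
    for job in sorted(jobs, reverse=True):
        load, i = heapq.heappop(loads)        # least-loaded machine
        assignment[i].append(job)
        heapq.heappush(loads, (load + job, i))
    makespan = max(load for load, _ in loads)
    return makespan, assignment

# Made-up molding-job durations (hours) on 3 identical machines
makespan, plan = lpt_schedule([7, 7, 6, 6, 5, 4, 4, 3], 3)
```

An exact MILP, as in the paper, would replace this greedy rule with a solver over assignment variables, trading speed for optimality guarantees.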
12:06 A Framework for Risk Management of Large-Scale Organisation Supply Chains
This paper establishes a novel approach to supply chain risk management (SCRM) through a risk assessment framework addressing the importance of SCRM and supply chain visibility (SCV). Through a quantitative assessment and empirical evidence, the paper also discusses the specific risks within the manufacturing industry. Based on survey data collected and a case study from Asia, the paper finds that supplier delays and poor product quality can be considered prevailing risks in the manufacturing industry. However, as supply chain risks are inter-related, one must increase supply chain visibility to fully consider the risk causes that ultimately lead to the risk effects. The framework established can be applied to different industries with the view to inform organisations of prevailing risks, motivate improvement in supply chain visibility and thereby refine risk management strategies. By suggesting possible risk sources, the framework allows organisations to adopt proactive risk mitigation strategies and manage their exposure more efficiently.
12:33 Backwater Calculation due to Arch Bridges with Skewed Crossing
This paper proposes a simple mathematical formula for estimating bridge backwater for a skewed multiple opening arch bridge. The impacts of different parameters that affect the backwater were studied by conducting a series of parametric studies. The results of the parametric study were incorporated into the proposed method in order to enhance the accuracy of backwater computation. The proposed method was then validated by comparing its results with experimental data and the results of the most commonly used energy method. The comparisons showed that the results of the proposed method have good correlations with both experimental data and the energy method results. In addition, it was concluded that the proposed method can be very helpful to hydraulic engineers due to its simplicity.
S6B: Medical Decision Making 6
11:40 Performance Evaluation of an Ensemble Method for Diagnosis of Chronic Kidney Disease with Feature Selection Technique
Olayinka Ayodele Jongbo (Ekiti State University, Ado Ekiti, Nigeria); Toluwase Ayobami Olowookere (Redeemer’s University, Ede, Nigeria); Adebayo Adetunmbi (Federal University of Technology, Akure, Nigeria)
Chronic Kidney Disease (CKD) is a public health issue and a major threat to human life: abnormal functioning of the kidney over a period of months or years, if left untreated, may damage vital organs, increase cardiovascular mortality and, if not detected early, result in sudden death. Data mining techniques are employed in several clinical diagnoses for making intelligent diagnostic decisions that can be applied in disease prediction. The performance of these techniques is very promising in the management of different ailments, helping to reduce the high number of people who die yearly due to inaccurate diagnosis of numerous disease conditions. This study evaluates the performance of a bagging ensemble technique on a CKD dataset with an effective feature selection technique, yielding a reliable and accurate predictive model capable of correctly separating diseased from non-diseased patients. The study was conducted on a real patient dataset obtained from the UCI machine learning repository, consisting of 400 instances with 24 conditional attributes and a decision class. The random forest algorithm was used to select the best subset of features for the predictive models. Naïve Bayes, k-Nearest Neighbor and Decision Tree algorithms serve as the base classifiers, whose outputs were aggregated using the bagging ensemble approach to improve the base learners' performance. Results obtained from the study showed the effect of feature selection and the ensemble technique in improving the accuracy of data mining classification algorithms. The optimal result is achieved using the 7 best selected features on the ensemble classifier, with an accuracy of 1.000 for CKD diagnosis, compared with an accuracy of 0.983 for the ensemble model without feature selection. This makes the model suitable for efficient diagnosis of CKD.
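The bootstrap aggregation (bagging) step described above can be sketched minimally as follows. This is not the paper's Naïve Bayes/k-NN/decision tree ensemble on the UCI CKD data: the base learner here is a trivial threshold "stump" and the one-dimensional data are made up:

```python
import random

def train_stump(sample):
    """Trivial base learner: threshold at the midpoint between the
    class means of a (value, label) sample."""
    pos = [x for x, y in sample if y == 1]
    neg = [x for x, y in sample if y == 0]
    if not pos or not neg:                  # degenerate resample: fall back
        t = sum(x for x, _ in sample) / len(sample)
    else:
        t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2.0
    return lambda x: 1 if x >= t else 0

def bagging_fit(data, n_estimators, seed=0):
    """Draw bootstrap resamples and train one base learner per resample."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        sample = [rng.choice(data) for _ in data]   # bootstrap sample
        models.append(train_stump(sample))
    return models

def bagging_predict(models, x):
    """Aggregate base predictions by majority vote."""
    votes = sum(m(x) for m in models)
    return 1 if 2 * votes >= len(models) else 0

# Made-up, linearly separable toy data: label 1 for large values
data = [(v, 0) for v in (1.0, 1.5, 2.0, 2.2)] + \
       [(v, 1) for v in (5.0, 5.5, 6.0, 6.3)]
models = bagging_fit(data, n_estimators=15)
```

In the paper's setting, `train_stump` would be replaced by the Naïve Bayes, k-NN, or decision tree base classifiers, trained on the feature subset chosen by random forest.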
12:00 Data Imputation-Based Learning Models for Prediction of Diabetes
The accuracy of automated diabetes prediction models using a patient's past health record is highly dependent on the correctness of the data used. If patient data are inconsistent and contain many missing values, then prediction is more challenging. In this paper, the impact of missing value imputation (MVI) techniques on diabetes prediction is evaluated. The experiments are performed on the Pima Indians diabetes dataset, which contains many missing values. First, MVI techniques are used for handling the missing values. Second, K-Means clustering is used to identify the best imputation technique based on the percentage of incorrectly classified instances in each imputed dataset. Third, principal component analysis (PCA) is used for feature extraction, and Info Gain is used for selecting the optimal set of features. Six different classification models, namely multi-layer perceptron (MLP), support vector machine (SVM), Naïve Bayes (NB), decision tree (J48), AdaBoost, and Bagging, are used in the experiments. Eight different techniques, namely CMC, Case Deletion, KMI, SVMI, WKNNI, KNNI, FKMI, and MC, are used for missing value imputation. The experimental results show that the case deletion and KMI imputed datasets have the lowest number of incorrectly classified instances. When the six classifiers are applied to these two datasets with six principal components, the MLP classifier attains the highest accuracy: 98.9967% with the case deletion imputed dataset and 99.2767% with the KMI imputed dataset. The other classifiers obtain accuracies ranging between 93% and 98%.
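As a minimal illustration of the missing value imputation step, the sketch below replaces each missing entry with its column mean, the simplest member of the family of MVI strategies the paper compares; the mini-table and missing markers are made up, not Pima data:

```python
def mean_impute(rows, missing=None):
    """Replace missing entries (marked by `missing`) in each column
    with that column's mean over the observed values."""
    n_cols = len(rows[0])
    means = []
    for j in range(n_cols):
        observed = [r[j] for r in rows if r[j] is not missing]
        means.append(sum(observed) / len(observed))
    return [[means[j] if r[j] is missing else r[j] for j in range(n_cols)]
            for r in rows]

# Made-up mini-table of two numeric features with missing entries
table = [
    [148.0, 33.6],
    [None,  26.6],
    [183.0, None],
    [89.0,  28.1],
]
completed = mean_impute(table)
```

The more elaborate techniques in the paper (KNNI, SVMI, FKMI, etc.) replace the column mean with estimates from similar records or fitted models, but follow the same fill-in pattern.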
12:20 Analysis of Dengue Fever Transmission Dynamics with Multiple Controls: A Mathematical Approach
Afeez Abidemi (Federal University of Technology Akure, Nigeria); Hammed Olawale Fatoyinbo (Massey University, New Zealand); Joshua Asamoah (Complex Systems Research Center, Shanxi University Taiyuan, China)
Dengue is a mosquito-borne viral infection caused by the dengue viruses of four serotypes, DENV-1 to DENV-4. The disease is transmitted by the bites of infected female Aedes mosquitoes. This paper presents a compartmental deterministic model including human prevention and vector control interventions for the dynamics of dengue fever spread. Theoretical analysis of the model is conducted to obtain the associated dengue-free equilibrium. The next generation matrix method is used to calculate the effective reproduction number. The local stability analysis of the dengue-free equilibrium is presented. Numerical simulations of different combinations of controls are considered, and their impact on the dynamical behaviour of the system is analysed. We found that dengue prevalence can be reduced in a community by implementing any control intervention that combines human prevention and vector control measures.
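The paper's full compartmental model is not reproduced here, but a minimal host-vector sketch with forward-Euler integration, using made-up parameters, illustrates the qualitative finding that a vector control measure (modelled as increased mosquito mortality) lowers the cumulative attack rate:

```python
def simulate_attack_rate(beta_hv, beta_vh, gamma, mu_v, days, dt=0.1):
    """Forward-Euler integration of a minimal host-vector model.
    s_h, i_h: susceptible/infectious human fractions; i_v: infectious
    vector fraction. Returns the cumulative fraction of humans infected.
    All parameters are illustrative, not the paper's."""
    s_h, i_h, i_v = 0.99, 0.01, 0.01
    for _ in range(int(days / dt)):
        new_h = beta_hv * s_h * i_v          # vector-to-human infections
        new_v = beta_vh * (1.0 - i_v) * i_h  # human-to-vector infections
        s_h -= dt * new_h
        i_h += dt * (new_h - gamma * i_h)
        i_v += dt * (new_v - mu_v * i_v)
    return 1.0 - s_h

# Vector control is modelled as tripling mosquito mortality mu_v
baseline = simulate_attack_rate(0.30, 0.30, gamma=0.10, mu_v=0.10, days=60)
controlled = simulate_attack_rate(0.30, 0.30, gamma=0.10, mu_v=0.30, days=60)
```

In the paper this comparison is done on the full model, with the effective reproduction number computed via the next generation matrix rather than by simulation.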
12:40 A real-time application to detect human voice disorders
Manisha M.G. Milani (Universiti Brunei Darussalam, Gadong, Brunei Darussalam); Murugaiya Ramashini (Universiti Brunei Darussalam, Brunei Darussalam & Uva Wellassa University, Sri Lanka); Krishani Murugiah (Pavendar Bharathidasan College of Engineering and Technology, India)
This paper proposes an automatic method to differentiate healthy and pathological human voices in real time, supporting more accurate medical decisions and allowing patients with disorders to seek early medical assistance. The main problem encountered in voice recognition is identifying the appropriate features and classifier. To address this problem, this paper introduces a fast and accurate voice recognition method that combines MFCC audio processing features with three machine learning algorithms: Decision Tree (DT), Support Vector Machine (SVM), and Artificial Neural Network (ANN). The ANN classifier achieved the highest classification accuracy of 87.5%, while DT and SVM achieved 62.5% and 50%, respectively.
S6C: Computerized Decision Aid 6
11:40 Designed Artifacts for Analyzing and Evaluating Children with Autism Spectrum Disorder (ASD) in Sri Lanka
In the fast-moving world of today, recent statistics on population patterns show that 1 in 63 children is affected by autism. Autism is a neurodevelopmental disorder of early childhood arising from abnormal development of the brain, in which affected children exhibit extraordinary behavioural patterns. To date there is no well-defined treatment for Autism Spectrum Disorder (ASD). In this study, an ICT-based artifact (specifically, a software package) is introduced as a novel approach, with the objective of differentiating the behaviour of neurotypical and ASD children. The artifacts are designed around three main impaired areas of ASD: eye contact, maturity level, and intelligence level. Accordingly, the developed software package comprises an Eye Movement Tracking tool, a Maturity Level Analyzer and an Intelligence Level Measuring tool. In the Eye Movement Tracking tool, a common sample video is shown to two different types of participants (neurotypical children and ASD children) and their eye movements during the video are recorded. The recorded data are processed using a symmetric mass center algorithm, and the results are displayed in three different graphs: a heat map, a dot plot and a linear graph. The Maturity Level Analyzer provides a drawing canvas where participants are asked to draw given shapes. The drawings are then analysed by comparing the standard deviation (SD) of the two participant groups' data. Based on the results, there is a significant difference between the drawings of the two types of participant: the SD of drawings made by neurotypical children was within -4 to +4 (the normal range), whereas drawings made by ASD children deviate from this range. The Intelligence Level Measuring tool comprises colour- and number-based activities whose responses allow a psychiatrist or parent to assess the child's condition. These tools were developed using Human Computer Interaction methods.
Testing and evaluation of the system were done with three (3) ASD patients and ten (10) neurotypical children in the 3-5 year age group. To evaluate the accuracy and performance of the tools, a pre-test and a post-test were conducted with the participants. Analysis of the test results showed that this set of tools is effective for supporting decisions in ASD prediction and for measuring the improvement of ASD children under treatment.
12:00 Combining Convolutional Neural Network and Long Short-Term Memory to Classify Sinusitis
As one of the common health problems, sinusitis is inflammation of the mucous membranes lining one or more of the paranasal sinuses. Improved detection tools to classify acute or chronic sinusitis are required because of the impact on the patient's treatment. In previous research, deep learning has demonstrated good accuracy in classifying disease. Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) are now popular for deep learning tasks. This research applies a one-dimensional (1D) CNN and its advanced modification with LSTM, called 1D CNN-LSTM, to classify the type of sinusitis. Data on sinusitis patients were obtained from Cipto Mangunkusumo Hospital, Jakarta, Indonesia. The dataset consists of 200 records with four features: gender, age, Hounsfield Unit (HU), and air cavity. The results show that the 1D CNN-LSTM achieves higher accuracy than the 1D CNN, at 98.33%.
12:20 Machine intelligence vs. human intelligence in geological interpretation of seismic data: an example from the Poseidon field, Northwestern Australia
Seismic data are rich in information about the structure of geological formations and their pore content. Seismic attributes are any information that can be derived or computed from seismic data. Over the past decade, the increasing number of seismic attributes has itself become a challenge for interpreters, and even for machines, in accomplishing a reliable interpretation. As wrong decisions in oil exploration may result in costly dry wells, a good understanding of the seismic attributes and of the limitations of the machine is of great importance. In this paper, we conduct a brief comparison between deterministic approaches by humans and statistical approaches established by machine intelligence. Then, a real example from Northwestern Australia is presented in which seismic attributes are used to find hydrocarbon-saturated rocks and discriminate them from their surrounding rocks. Machine learning, namely an artificial neural network (ANN), is used to combine the seismic attributes and produce a hydrocarbon probability cube from the seismic data. A number of seismic attributes are used in the training of the neural network. Further, we propose a new seismic attribute combining the gradient and the scaled-Poisson reflectivity, which are both sensitive to fluid. The new attribute showed better detection than the neural network. To benefit from both the statistical relationship determined by the ANN and the proposed deterministic attribute, we added the attribute to the ANN training, achieving better detection of the gas-saturated reservoir. We conclude that machine learning algorithms can reduce the workload and save computation time, but they need assistance from the interpreter to make good decisions.
12:40 Deep Neural Network-Based Approach to Identify the Crime Related Twitter Posts
Chamith Sanadagiri (Sabaragamuwa University of Sri Lanka, Sri Lanka); Banage Kumara (Sabaragamuwa University of Sri Lanka, Sri Lanka & University of Aizu, Japan); Banujan Kuhaneswaran (Sabaragamuwa University of Sri Lanka, Sri Lanka)
Crime prediction is an important task for reducing criminal activities in society. Harm can be minimized by identifying crime hotspots before any crime happens, using a crime prediction model; thus, crime prediction is becoming a hot topic among researchers. Crime patterns can be identified by analyzing historical crime data and used to predict future crimes. Researchers have used different sources of crime-related data to build prediction models, but some crimes go unregistered. In this paper, we use Twitter posts to detect crimes, since people share information about their environment via Twitter. We propose a Bidirectional Encoder Representations from Transformers (BERT) approach to identify crime-related posts. Our approach outperformed existing approaches, obtaining 92.8% accuracy.
S6D: Energy Management Decisions 4
11:40 Promoting the Benefits of ESCO Projects at University of Bahrain
Energy service companies (ESCOs) offer many benefits for scaling up energy efficiency and can potentially contribute to energy savings in the government and private sectors. In Bahrain, the ESCO is a relatively new concept in the business market, and there is minimal experience in implementing ESCO projects. Energy saving and energy efficiency motivate all United Nations members to pursue the sustainable development goals. The purpose of this paper is to promote the importance of ESCO projects in Bahrain and to increase awareness and spread the culture of energy efficiency projects through ESCOs, with the University of Bahrain considered as a role model for such projects toward sustainable development.
12:00 Natural Gas Pipeline Failure Risk Prediction and Relation Analysis by Combining Rough-AHP and Rough-DEMATEL Methods
The main purpose of this research is to identify the most crucial failure factors and, accordingly, improve security management and reduce potential failure risks. In daily life, natural gas is widely used in the manufacturing industry and for household activities such as heating, cooking and the production of electricity. Over the last two decades, the natural gas consumption rate has been increasing exponentially all over the world. As a result, energy provider companies are under growing pressure to supply safe and reliable distribution from the source to the specific consumer. Any kind of pipeline failure may cause a catastrophic disaster, i.e. human casualties, financial penalties, delays in the production of manufactured goods and environmental pollution. With the help of the Rough Analytic Hierarchy Process (Rough-AHP), energy providers can rank failure factors according to their importance. In addition, the Rough Decision-Making Trial and Evaluation Laboratory (Rough-DEMATEL) method can analyze the cause-effect relations among gas pipeline failure criteria. The energy supplier company can therefore take the necessary action plan to reduce the potential pipeline failure risk, and can estimate the budget for the maintenance program based on priority.
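The rough-number extensions are not reproduced here, but the underlying classical AHP step of deriving priority weights from a pairwise comparison matrix can be sketched with the row geometric mean approximation; the judgments and failure causes below are hypothetical:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights of a pairwise comparison matrix
    using the row geometric mean, normalized to sum to 1."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical pairwise judgments for three failure causes
# (corrosion, third-party damage, construction defect);
# M[i][j] > 1 means cause i is judged more important than cause j.
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(M)   # the most important cause gets the largest weight
```

Rough-AHP replaces the crisp judgments with rough-number intervals aggregated over multiple experts, but the priority-derivation idea is the same.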
12:20 Daily Forecasting of Photovoltaic Power Using the Non-Linear Auto-Regressive Exogenous Method
Accurate and realistic forecasting models are very relevant for the successful integration of renewable energy sources into smart grids. This work presents a short-term forecast of the power output of a building photovoltaic system installed at Heliopolis University, Cairo, Egypt. The proposed method forecasts one day ahead using the non-linear auto-regressive exogenous (NARX) method, based on neural network time series with exogenous input data. The neural network is trained with a Bayesian algorithm to attain the best performance of the forecasted results. NARX neural networks are effective on a variety of problems and are common for nonlinear control applications as well as for forecasting. The NARX model offers fast training, a good level of convergence and good representation, and is characterized by attractive dynamics and sensitivity to interference; it differs in its use of exogenous neural network inputs, or exogenous variables. The performance of the NARX approach was analyzed as a function of the training data sets, with error measures based on experimental results from the photovoltaic system. The NARX results are compared with experimental data and with a standard neural network method. In addition, a statistical criterion is used to evaluate the effectiveness and performance of the NARX model; NARX outperformed the standard neural network in terms of this criterion, attaining a low value. In general, the efficiency and precision of the results depend heavily on the environmental conditions and the input data.
12:40 Statistical Fitting of Wind Speed Data for Determination of Wind Power Potentials in Saudi Arabia
This paper presents a statistical analysis of wind speed data for the Kingdom of Saudi Arabia. The study aims to determine the best probability distribution function for the wind speed data of selected sites in the Kingdom. This is achieved by analysing several probability distribution functions: the Weibull, gamma, and Rayleigh distributions. In order to determine the best fits, the mean annual error was used. Results show that the Weibull distribution has a 0.000601 annual average error, followed by the gamma distribution with an annual error of 0.362221, and finally the Rayleigh distribution with an annual error of 0.63323. This indicates that the sites under study are suitable for electrical power generation in both grid-connected and off-grid modes. Comparing the annual error of each site shows that the gamma distribution performs better than the other distributions at individual sites. The results identify the best sites for wind power and are very important for various microgrid designs in the Kingdom of Saudi Arabia.
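The fitting comparison above can be sketched by evaluating the two-parameter Weibull density and measuring a mean absolute error against an empirical histogram. The wind-speed bins and parameters below are made up, and the paper's exact estimation procedure and error definition are not specified here:

```python
import math

def weibull_pdf(v, k, c):
    """Two-parameter Weibull probability density for wind speed v >= 0,
    with shape k and scale c. Rayleigh is the k = 2 special case."""
    if v < 0:
        return 0.0
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def mean_abs_error(observed_freq, k, c, bin_width=1.0):
    """Mean absolute error between empirical bin frequencies and the
    Weibull probability mass approximated as pdf * bin width, with the
    pdf evaluated at each bin midpoint."""
    errs = [abs(f - weibull_pdf((i + 0.5) * bin_width, k, c) * bin_width)
            for i, f in enumerate(observed_freq)]
    return sum(errs) / len(errs)

# Made-up empirical wind-speed histogram (fraction of hours per 1 m/s bin)
freq = [0.05, 0.12, 0.18, 0.20, 0.17, 0.12, 0.08, 0.05, 0.02, 0.01]
err = mean_abs_error(freq, k=2.0, c=4.5)
```

Repeating this for candidate distributions (Weibull, gamma, Rayleigh) and picking the lowest mean error mirrors the comparison reported in the paper.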
S6E: Decision Making Using IoT and ML 8
11:40 Internet of Things: A Review of Architecture and Protocols
IoT is a network of heterogeneous devices that are connected through the Internet. The IoT network is growing rapidly in every field, such as healthcare, the military, agriculture, traffic management, and education. In the coming era, most communication will take place through IoT. IoT applications increase comfort, work automation, and efficiency for users. These networks take the form of wireless, wired, and ad-hoc networks. The main goal of IoT is to provide seamless interconnectivity by connecting anybody to anything, anywhere, at any time, over any network. Protocols play a major role in interconnectivity, heterogeneity, handling dynamic changes, security, and managing the number of devices connected to IoT. This paper presents the architecture of IoT and the services provided by different protocols.
12:00 Research paper classification based on Word2vec and community discovery
With the advances in information technologies, different research areas are emerging day by day and thousands of research papers are published in these fields. Papers are not presented to users grouped according to their topics, so it is getting harder and harder for people to access research papers in their field of interest. To facilitate this search process, a paper classification system is proposed in this study to categorize papers on similar topics, taking a different approach from the classification studies presented so far. The system is built by combining Word2vec, network modeling and community discovery. With the Word2vec method, which has recently attracted great attention, paper similarities are computed; a network based on the similarity rates is then created, and papers are clustered with a community discovery algorithm. In experiments with the proposed system, a success rate of 89% was achieved. As the results show, this approach to an important classification problem will provide great convenience to people, enabling fast and efficient access to papers.
12:20 Automatic Term Extraction on Turkish Scientific Texts
In order for a text or collection to be understood, it is very important to understand the terms contained in it. This study aims to detect terms in a domain-specific (cyber security) corpus. A two-layer method is proposed for identifying terms consisting of single words or phrases. In the first layer, term candidate words are determined by statistical methods. In the second layer, the possibility of these words being used in phrases is checked with semantic approaches. The Word2Vec approach was used to determine semantic affinity, and 3 different datasets were used. The results show that single-word and two-word terms were successfully identified using the proposed method.
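The abstract does not specify the exact statistical layer; one common first-layer heuristic of this kind scores candidates by their relative frequency in the domain corpus versus a general reference corpus (the "weirdness" measure). The toy corpora below are made up:

```python
from collections import Counter

def weirdness_scores(domain_tokens, general_tokens, smooth=1.0):
    """Score each domain word by (domain relative frequency) /
    (general relative frequency), with additive smoothing so unseen
    general-corpus words do not divide by zero. High scores suggest
    domain-specific term candidates."""
    d, g = Counter(domain_tokens), Counter(general_tokens)
    nd, ng = len(domain_tokens), len(general_tokens)
    return {w: (d[w] / nd) / ((g[w] + smooth) / (ng + smooth))
            for w in d}

# Toy domain (cyber security) and general reference corpora
domain = "firewall blocks malicious packets and firewall logs packets".split()
general = "the cat sat and the dog ran and ran".split()
scores = weirdness_scores(domain, general)
```

Words that score high here would pass to the second, semantic layer (e.g. Word2Vec similarity checks on multi-word candidates).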
12:40 A Sentiment Analysis Study on Recognition of Facial Expressions: Gauss and Canny Methods
Human-computer interaction has been a focus of current research. It is accepted as a multidisciplinary field in which interaction takes place through interfaces; these interfaces can be software functions, or interaction can be provided through hardware components. Facial expressions give information about people's emotions and play an important role in sentiment recognition. Today, facial expressions are used in many fields such as education, psychological studies, virtual reality, robotics, facial animation, health and law, and the need for the analysis of facial expressions in many areas is increasing. The analysis of human facial expressions by computers is a remarkable research area, but it is considered a challenging problem; facial expressions must be analyzed accurately and quickly by software. In this study, sentiment recognition from facial expressions (sad, happy, scared, confused) was performed using 50 different images obtained from various databases and internet sources. With digital image processing techniques, improved images can be obtained and features can be extracted. The digital image processing functions and programming language of MATLAB 2018, which provides advanced programming for scientific studies, were used. Image noise was removed with a Gaussian filter, and edge detection was performed with the Canny method. Geometric ratios were used to eliminate errors. The study found that sentiment recognition performed on images with similar facial expressions produced incorrect sentiment classifications; however, face recognition with MATLAB functions and MATLAB programming generally produced successful results.
S6F: Economics and Investment Decision Aid 1
11:40 Multicriteria Assessment of the Creative-Innovative Potential of Brazilian Cities
Victor D. H. de Carvalho and Jaime Cirilo (Universidade Federal de Alagoas, Brazil); Felippe de Barros (Federação do Comércio de Bens, Serviços e Turismo do Estado de Alagoas, Brazil); Thyago Nepomuceno (Universidade Federal de Pernambuco, Brazil)
This paper presents the initial results of an assessment of the creative-innovative potential of a group of Brazilian cities. The proposal is based on two pillars: open data platforms and multicriteria analysis. Through the open data platforms it was possible to obtain data and define the criteria and related indicators used in the study, while the application of the multicriteria methods PROMETHEE I and II produced a ranking of the cities from the perspective of their creativity and innovativeness.
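The PROMETHEE II net-flow computation can be sketched as follows, using the "usual" (strict-dominance) preference function; the criteria, weights, and city scores are hypothetical, not the study's open data:

```python
def promethee_ii(scores, weights):
    """PROMETHEE II net flows with the 'usual' preference function
    (preference 1 if strictly better on a criterion, else 0).
    scores[a][j] is alternative a's value on criterion j (to maximize);
    weights[j] is the weight of criterion j."""
    n = len(scores)

    def pref(a, b):  # weighted aggregated preference of a over b
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights)
                   if sa > sb)

    flows = []
    for a in range(n):
        plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)
        minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)
        flows.append(plus - minus)       # net flow: higher ranks better
    return flows

# Hypothetical city scores on (patents, startups, cultural venues)
cities = [[0.9, 0.7, 0.8],   # city A
          [0.6, 0.8, 0.5],   # city B
          [0.4, 0.3, 0.6]]   # city C
net = promethee_ii(cities, weights=[0.5, 0.3, 0.2])
ranking = sorted(range(len(cities)), key=lambda a: net[a], reverse=True)
```

PROMETHEE I would keep the positive and negative flows separate, yielding a partial order; PROMETHEE II collapses them into the complete ranking used here.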
12:00 Which Will Serve Better as a Hedge or Diversifier: Gold or Bitcoin?
Md. Jamal Hossain (Universiti Sains Malaysia, Malaysia & Noakhali Science and Technology University, Bangladesh); Mohd Tahir Ismail (Universiti Sains Malaysia, Malaysia); Sadia Akter (University of Rajshahi, Bangladesh); Mohammad Raquibul Hossain (Universiti Sains Malaysia, Malaysia)
Gold is the most attractive precious metal and thus the investor's first choice for an alternative investment. Since its establishment, Bitcoin has become another alternative investment choice. Both Gold and Bitcoin have attractive features for risk management and portfolio risk. In this paper, we compared the performances of Gold and Bitcoin as hedges or diversifiers and tried to isolate the better one. For comparison purposes, we used the same methodology, an asymmetric GARCH model, considering three different markets: the energy market, the currency market, and the stock market. We found that Gold serves as a diversifier against crude oil WTI and a short-run hedge against the S&P 500, while Bitcoin only offers diversifier benefits against crude oil WTI. For portfolio analysis, Gold is better than Bitcoin; therefore, it should be the first choice for investors.
12:20 The Impact of Trade and Financial Openness on Banks Financial Development in GCC Countries
This study examines the impact of trade and financial openness on banks' financial development in the Gulf Cooperation Council (GCC) countries. The study used a panel dataset of 67 key banks from GCC countries over the period 2009-2018. Banks' financial development is the dependent variable, measured by three sets of indicators: the cost of banks' credit, the volume of banks' credit, and banks' risk-taking. A fixed effect regression model shows that both trade openness and financial openness can promote the financial development of GCC countries' banks: both significantly affect the cost and volume of banks' credit, but have an insignificant impact on banks' risk-taking. The authors recommend that simultaneously opening trade and finance is a more reliable way to improve financial development in the GCC. This could happen when policy makers place trade and financial openness at the core of national development strategies; adopt trade facilitation, logistics, and border management reforms to help their economies integrate into global value chains through targeted reforms and investments; and advise on and support the implementation of commitments made through trade openness agreements.
12:40 Fiscal and Monetary Policy, Trade Openness, and its Impact on Indonesian Exchange Rate
The Indonesian rupiah has been quite volatile in recent years, and many factors determine its fluctuation. It is necessary to study which variables, including fiscal policy, monetary policy, and international trade policy, affect the fluctuation of the rupiah exchange rate. An ARDL (Autoregressive Distributed Lag) model with quarterly data from 2008 to 2018 is used to study the short- and long-term effects among the variables studied. The results show that fiscal policy, monetary policy and trade openness are co-integrated, but individually do not affect the exchange rate in the long run. In the short term, on the other hand, these variables greatly influence the fluctuation of the rupiah exchange rate. It can therefore be suggested that, first, financing the fiscal deficit with sovereign debt instruments is better than foreign debt, given their different effects on the rupiah exchange rate. Second, the monetary authority must make domestic interest rates more competitive and consider conditions in the real sector. Lastly, the weakening of the exchange rate due to increased imports can be an opportunity for exporters to improve export performance, so that the exchange rate will stabilize again.
S6G: Decision Making Using IoT and ML 9
11:40 A Performance Analysis of Deep Convolutional Neural Networks using Kuzushiji Character Recognition
The recent increase in computational power and data available to the general public has given rise to the development of a plethora of highly performant deep neural network architectures. A popular and successful application of these has been in the field of image recognition, in which many yield promising results. Yet, there are differences and caveats in each, and knowing the right one to choose for a particular task can save a lot of money, time, and computational resources. In order to take a closer look at the differences in their performance, we have compared 5 of the many top architectures, namely VGG16, VGG19, DenseNet121, ResNet50, and ResNet152V2, in terms of their error percentage, class-balanced error percentage, standard deviation of class-balanced accuracy, and training time. To achieve this, we have used 2 datasets: the Kuzushiji MNIST or KMNIST dataset, a balanced dataset with 10 classes, and the Kuzushiji 49 or K49 dataset, an imbalanced dataset with 49 classes, both of which comprise cursive characters from the Japanese Hiragana script. All of the results were computed and verified using 10-fold cross-validation. For the Kuzushiji MNIST dataset, the VGG16 showed the best results in all aspects. For the Kuzushiji 49 dataset, the DenseNet121 showed the highest performance in all aspects other than training time, which was lowest for the VGG16.
12:00 Optimizing Ambulance Paths: Case Study of Red Cross Tripoli Branch
The Red Cross, an international humanitarian organization, helps people suffering from hunger, war, disease, or other problems, reaching those in need with its own ambulances. Due to congestion problems in Lebanon, the ambulances are sometimes late, which threatens people's lives. This paper aims to help the ambulances of the Red Cross Tripoli Branch reach their emergencies promptly, with and without congestion. The problem is solved using Wolfram Mathematica 8.0, where the shortest path in terms of travel time is found. In the end, one shortest path is selected for the Red Cross to use when there is congestion and another when there is not.
12:20 Handling Unknown Words in Neural Machine Translation System
The corpus-based approach is an emerging approach to developing machine translation systems. Statistical Machine Translation (SMT) and Neural Machine Translation (NMT) are two corpus-based systems. NMT yields better results than both the traditional rule-based approach and the statistical approach. The computational complexity of an NMT system is higher than that of an SMT system due to the use of the softmax function at the output layer. Because of this complexity constraint, NMT uses a fixed vocabulary, but Machine Translation (MT) is an open-vocabulary problem. This causes out-of-vocabulary (OOV) words in the predictions of the NMT system. To overcome these OOV words, Word Embedding (WE) has been used in our NMT model for Punjabi to English. Alongside WE, Byte-Pair Encoding (BPE) has also been used to increase the effectiveness of the overall system. The system has been evaluated using the automated evaluation metrics BLEU score and Translation Error Rate (TER).
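The BPE step can be illustrated with the classic merge loop: repeatedly replace the most frequent adjacent symbol pair with a new merged symbol, so frequent subwords survive in the vocabulary and rare words decompose into known pieces. The toy corpus and merge count below are hypothetical, not the paper's Punjabi-English data:

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn BPE merge rules from a word -> frequency dict."""
    vocab = {tuple(w): f for w, f in word_freqs.items()}  # words as symbol tuples
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for syms, freq in vocab.items():
            for pair in zip(syms, syms[1:]):
                pairs[pair] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        new_vocab = {}
        for syms, freq in vocab.items():
            out, i = [], 0
            while i < len(syms):
                if i + 1 < len(syms) and (syms[i], syms[i + 1]) == best:
                    out.append(syms[i] + syms[i + 1])  # apply the merge
                    i += 2
                else:
                    out.append(syms[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges, vocab

corpus = {"low": 5, "lower": 2, "lowest": 3}
merges, vocab = learn_bpe(corpus, 2)
# merges == [('l', 'o'), ('lo', 'w')]; every word now starts with the unit "low"
```

At translation time, an OOV word is segmented by replaying these merges, so it maps onto subword units the model has seen.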
12:40 Minimum-Cost Flight Package Development Using Dijkstra’s Shortest Path
To help customers save money, many websites and companies compare airlines so customers can choose the cheapest trip. These services are limited because they only compare airlines, while other strategies can help the customer save more money. One such strategy is to travel to the destination via another country instead of flying directly from the home country, where the ticket can cost more than it would from elsewhere. This article gives a new solution for saving money while traveling. It consists of studying the cost of all flights between all airports in the world across all airlines; with these data, the system chooses the cheapest sequence of flights to get the customer from the origin country to the destination. The system finds this cheapest path using a traditional shortest-path algorithm, Dijkstra's algorithm, to save as much money as possible. This article studies a case where a customer wants to travel from Beirut to New York City, and the system helps decision-makers make the right decision to save money on the trip.
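The core of such a system is a plain shortest-path search over a fare graph. The sketch below is a minimal illustration; the airports and fares are invented for the example, not the article's data:

```python
import heapq

def cheapest_route(graph, source, target):
    """Dijkstra's algorithm over a fare graph.

    graph: dict mapping airport -> list of (neighbour, fare) pairs.
    Returns (total_fare, airport_list), or (inf, []) if unreachable.
    """
    dist = {source: 0}
    prev = {}
    heap = [(0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:                      # reconstruct the path
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        for v, fare in graph.get(u, []):
            nd = d + fare
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float("inf"), []

# Hypothetical fares: flying Beirut -> Istanbul -> New York is cheaper
# than the direct Beirut -> New York ticket.
fares = {
    "Beirut": [("New York", 900), ("Istanbul", 120)],
    "Istanbul": [("New York", 450)],
}
cost, route = cheapest_route(fares, "Beirut", "New York")
# cost == 570, route == ["Beirut", "Istanbul", "New York"]
```

With real data, the same call would run over every airline's published fares between all airport pairs.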
S6H: Financial Decision Making 2
11:40 Comparative performance of Islamic Market Index based on Optimized Renko Method
Mohammad Omar Farooq (University of Bahrain, Bahrain)
Renko is a Japanese charting style popular among users of technical analysis. In this paper the method is used to compare the performance of the Dow Jones Industrial Average with an Islamic Market Index. Results show that the method is equally effective for both types of indices and can produce above-average returns over longer time frames for the Islamic index as well.
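For readers unfamiliar with Renko, the chart ignores time and draws a fixed-size "brick" only when price moves a full brick beyond the last boundary, filtering out small moves as noise. A simplified construction rule is sketched below (real Renko charts typically require a two-brick move before reversing direction); the prices are hypothetical:

```python
def renko_bricks(prices, brick_size):
    """Convert a price series into Renko bricks: +1 (up) / -1 (down).

    A new brick is drawn only when price moves a full brick_size
    beyond the last brick boundary; smaller moves are ignored.
    """
    bricks = []
    anchor = prices[0]
    for p in prices[1:]:
        while p >= anchor + brick_size:
            bricks.append(1)
            anchor += brick_size
        while p <= anchor - brick_size:
            bricks.append(-1)
            anchor -= brick_size
    return bricks

prices = [100, 101, 103, 102, 99, 100, 104]
renko_bricks(prices, 2)
# -> [1, -1, 1, 1]: the one-point wiggles never produce a brick
```

A trading rule can then be stated purely in bricks, e.g. enter after two consecutive up bricks; the paper's "optimized" variant would tune the brick size.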
12:00 Application of GA Feature Selection on Naïve Bayes, Random Forest and SVM for Credit Card Fraud Detection
Yakub Kayode Saheed (Al-Hikmah University, Ilorin, Nigeria); Moshood Hambali (Federal University Wukari, Nigeria); Micheal Olaolu Arowolo (Landmark University, Omu-Aran, Nigeria); Yinusa Olasupo (Federal University Wukari, Nigeria)
Credit Card Fraud (CCF) has been a significant problem facing credit card holders and card-issuing companies over the past decades. CCF is performed at two levels: transaction-level fraud and application-level fraud. The focus of this paper is application-level CCF detection using a Genetic Algorithm (GA) as the feature selection technique. The GA feature selection runs in two phases: the first phase selects eight (8) attributes as the fittest, designated as the priority features; the second phase selects another set of eight (8) attributes, referred to as the second-priority features. The Naïve Bayes (NB), Random Forest (RF), and Support Vector Machine (SVM) supervised machine learning techniques are used for the detection of CCF on the German credit card dataset, which is imbalanced. The experimental findings of the proposed model revealed that the first-priority features are the most important features. The obtained results also showed that the RF algorithm outperformed NB and SVM in terms of accuracy, fraud detection rate, and precision.
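GA feature selection can be sketched with bit-mask chromosomes, one-point crossover, and bit-flip mutation. Everything below is a toy stand-in: the fitness function mimics a classifier that rewards informative features and penalizes subset size, and it is not the paper's wrapper around NB/RF/SVM:

```python
import random

def ga_select(n_features, fitness, pop_size=30, gens=60, p_mut=0.1, seed=7):
    """Tiny genetic algorithm: individuals are bit-masks over features."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_features):             # bit-flip mutation
                if rng.random() < p_mut:
                    child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness standing in for classifier accuracy: features 0, 2 and 5
# are informative, and every selected feature costs a small penalty.
informative = {0, 2, 5}
def fitness(mask):
    hits = sum(mask[i] for i in informative)
    return hits - 0.1 * sum(mask)

best = ga_select(8, fitness)
```

In the paper's setting, fitness would instead be the cross-validated detection accuracy of the chosen classifier on the selected columns of the German credit dataset.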
12:20 Stock price forecast with deep learning
In this paper, we compare various approaches to stock price prediction using neural networks. We analyze the performance of fully connected, convolutional, and recurrent architectures in predicting the next-day value of the S&P 500 index based on its previous values. We further expand our analysis by including three different optimization techniques: Stochastic Gradient Descent, Root Mean Square Propagation, and Adaptive Moment Estimation. The numerical experiments reveal that a single-layer recurrent neural network with the RMSprop optimizer produces optimal results, with validation and test Mean Absolute Errors of 0.0150 and 0.0148 respectively.
12:40 How to choose an online financial product
Internet finance products have been considered a significant driving force in China's economy since they emerged. This paper explores the essence of internet installment payment and analyzes the interest rates of various internet financial products, including but not limited to credit card, second-hand car, and new car loans. It also surveys and compares the declared interest rates of several popular platforms with their 'effective annual interest rates'. The results suggest that the effective annual interest rate is usually much higher than what the platforms declare. Such a marked difference between the effective annual interest rate and the declared rate can easily mislead customers. The paper aims to articulate these differences so that borrowers and regulators can gain more insight into the business strategy and make wiser decisions.
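The gap between a declared rate and the effective annual rate is easy to demonstrate: on an installment plan, interest quoted "flat" on the original principal roughly doubles once computed on the declining balance. The sketch below solves for the internal monthly rate by bisection; the loan figures are hypothetical, not drawn from any surveyed platform:

```python
def effective_annual_rate(principal, monthly_payment, months):
    """Internal monthly rate of an installment plan, annualized with compounding."""
    def npv_gap(rate):
        # present value of all payments at this rate, minus the principal
        return sum(monthly_payment / (1 + rate) ** t
                   for t in range(1, months + 1)) - principal
    lo, hi = 1e-9, 1.0
    for _ in range(100):              # bisection on the monthly rate
        mid = (lo + hi) / 2
        if npv_gap(mid) > 0:
            lo = mid                  # payments worth too much: rate too low
        else:
            hi = mid
    monthly = (lo + hi) / 2
    return (1 + monthly) ** 12 - 1

# Hypothetical plan: borrow 12,000 repaid in 12 payments of 1,090
# ("0.75% per month flat", i.e. a declared rate of about 9% per year).
ear = effective_annual_rate(12000, 1090, 12)
# ear is about 0.175: nearly double the declared 9%
```

The doubling arises because later installments keep paying interest on principal that has already been returned.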
S7A: Decision Aid in Logistics and Engineering 8
14:30 Preference disaggregation in ARAS-H method for road safety problem in Tunisia
Road safety is a major problem to deal with due to its impact on human lives. Indeed, implementing a clear road safety strategy leads to a decrease in road accidents and therefore in the resulting injuries and fatalities. Multi-criteria decision aiding methods naturally find their place here, since we are dealing with a ranking problem within a hierarchical structure of criteria. In this case study, the ARAS-H method is used to rank the Tunisian governorates in terms of the number of accidents. Given that the ARAS-H method requires criteria weights in order to be implemented, we propose a procedure to elicit ARAS-H criteria weights through mathematical programming. Thus, we develop a set of linear programs which aim to infer ARAS-H criteria weights at each level of the hierarchy tree, taking into account the decision maker's (DM) preferences.
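For reference, the crisp single-level ARAS core looks as follows; ARAS-H extends it over a hierarchy of criteria, and the paper's contribution is inferring the weights by linear programming from the DM's preferences. The governorate scores and weights below are invented for illustration:

```python
def aras_scores(matrix, weights, benefit):
    """Crisp ARAS: prepend the ideal alternative, invert cost criteria,
    normalize each column to unit sum, weight, and report each
    alternative's utility relative to the ideal."""
    ideal = [max(col) if b else min(col)
             for col, b in zip(zip(*matrix), benefit)]
    rows = [ideal] + [list(r) for r in matrix]
    for j, b in enumerate(benefit):
        if not b:                       # cost criterion: smaller is better
            for r in rows:
                r[j] = 1.0 / r[j]
    col_sums = [sum(col) for col in zip(*rows)]
    scores = [sum(w * v / s for v, w, s in zip(r, weights, col_sums))
              for r in rows]
    return [s / scores[0] for s in scores[1:]]   # utility degree per alternative

# Hypothetical governorates scored on accidents (cost) and road quality (benefit).
matrix = [[120, 0.8],   # governorate A
          [300, 0.6],   # governorate B
          [ 80, 0.7]]   # governorate C
utilities = aras_scores(matrix, [0.6, 0.4], benefit=[False, True])
# governorate C ranks safest, B worst
```

The weight-elicitation step would constrain these utilities to reproduce the DM's stated pairwise preferences and let a linear program solve for the weights.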
14:50 Evaluating environmental quality in Tunisia using Fuzzy CODAS SORT method
The COmbinative Distance-based Assessment (CODAS) method, a relatively new MCDM technique, selects the alternative having the largest Euclidean and Taxicab (Hamming) distances to the negative-ideal point. CODAS-Sort, a recently introduced sorting variant, uses crisp class assignment of alternatives. This can sometimes be misleading, especially for alternatives near the border of two classes. This paper aims to make the class assignment process in CODAS-Sort more flexible by using fuzzy set theory, which facilitates soft transitions between classes and provides additional information about the membership of alternatives in each class that can be used to fine-tune actions beyond the crisp sorting process. This essentially complements the ordinal information of the crisp variant with cardinal information in the form of the degree of membership of an alternative to each class. We use linguistic variables and trapezoidal fuzzy numbers to extend the CODAS method. The applicability of the proposed approach is illustrated in a case study concerning the classification of Tunisian towns according to their environmental quality.
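The trapezoidal membership degrees that drive such a soft assignment can be sketched directly; the class bounds and scores below are hypothetical, not the study's data:

```python
def trapezoidal_membership(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy number (a, b, c, d):
    rises linearly on [a, b], equals 1 on [b, c], falls linearly on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical "moderate environmental quality" class on a 0-100 scale.
moderate = (30, 45, 55, 70)
degrees = [trapezoidal_membership(x, *moderate) for x in (30, 40, 50, 60, 75)]
# A town scoring 40 belongs to the class with degree (40 - 30) / (45 - 30) = 2/3,
# so it partially belongs to "moderate" and partially to the neighbouring class.
```

Summing a town's degrees across overlapping classes is what yields the cardinal information the crisp sort lacks.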
15:10 Decision Support Using Simulation to Improve Productivity: A case study
The use of modeling and simulation as a decision aid tool has been widely recognized in the field of production engineering. In this work, modeling and simulation are used to examine, analyze, and recommend improvements for the performance of an existing assembly line within a manufacturing facility. Several key performance indicators (KPIs) are determined to evaluate the performance of the assembly line under study. With the aid of the Maynard Operation Sequence Technique (MOST) and line balancing methods, two improved systems are recommended to enhance the performance of the line. Simulation of the two recommended systems shows enhanced performance of the assembly line.
15:30 Optimistic disclosure tone in corporate annual reporting and financial performance
The paper investigates the association between optimistic disclosure tone and corporate financial performance using a sample of listed firms in Gulf Cooperation Council (GCC) countries covering the period 2012-2018, with a total of 779 observations. The optimistic disclosure tone is measured with the DICTION 7.0 software, while financial performance is measured by return on equity; seven control variables are also included. The paper finds that firm performance has a positive association with the reported optimistic tone. This finding supports the agency and impression management theories.
S7B: Decision Aid in Logistics and Engineering 9
14:30 A two stage approach for off-price retailers selection
This research proposes a decision-making approach to help managers select off-price retailers (discounters and clearance wholesalers) and allocate unsold inventory. The proposed approach is structured into three main steps. First, the fuzzy analytic hierarchy process method is used to assign a global importance weight based on the company's strategy regarding business risk and performance. Second, the calculated scores from the fuzzy analytic hierarchy process are used to select a short-list of off-price retailers. Third, the two combined risk and performance scores for each off-price retailer on the short-list are added to the total profit and other criteria to select the best ones and allocate unsold inventory using multi-objective optimization. Results from a numerical example show the merit of the proposed approach in terms of effectiveness.
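The weighting step can be illustrated with the crisp AHP core (the paper uses the fuzzy extension, which carries fuzzy judgments through a similar aggregation). The row geometric mean approximates the principal-eigenvector priorities; the pairwise judgments below are hypothetical:

```python
def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric mean."""
    n = len(pairwise)
    gms = []
    for row in pairwise:
        prod = 1.0
        for v in row:
            prod *= v
        gms.append(prod ** (1.0 / n))   # geometric mean of each row
    total = sum(gms)
    return [g / total for g in gms]     # normalize to sum to 1

# Hypothetical judgments: risk is 3x as important as performance,
# 5x as important as profit; performance is 2x as important as profit.
matrix = [
    [1,     3,     5],
    [1 / 3, 1,     2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(matrix)
# weights sum to 1 and rank risk > performance > profit
```

These global weights would then score each off-price retailer before the multi-objective allocation step.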
14:50 An Investigation into the Contributing Factors of Excess Inventory within The Cosmetic Industry in the UAE: An AHP analysis
Vian S Ahmed (American University of Sharjah, United Kingdom (Great Britain)); Sara Saboor, Heba Khlaif, Ahmed Al Suwaidi, Dana Yazbak and Ameera Khan (American University of Sharjah, United Arab Emirates)
Supply Chain Management (SCM) has become a significant part of the operational strategy in many organizations as they strive to retain their market share. Thus, organizations constantly reassess their supply chains to reduce costs wherever viable while promoting sustainability in line with the Green Supply Chain Management initiative. However, one of the main factors in any supply chain that opposes both of these key concepts is excess inventory. Retaining excess inventory in a supply chain contributes to inventory carrying costs and harmful environmental impacts. Despite all the efforts to promote sustainability in the UAE, not all organizations are aware of this issue. A particularly pressing industry to assess is the cosmetics industry, given that the UAE was considered the highest consumer of cosmetics in the year 2008. This study therefore identifies the major contributors to excess inventory in the cosmetic industry in the UAE. The paper also proposes a set of recommendations to overcome the challenges faced in managing excess inventory. The implication of this research is to assist cosmetic organizations in the UAE in setting a benchmark to prioritize the major contributors to excess inventory.
15:10 A Path Planning Optimization Algorithm Based on Particle Swarm Optimization for UAVs for Bird Monitoring and Repelling – Simulation Results
Ricardo Jorge Mendes Mesquita (University of Beira Interior & C-MAST – Centre for Mechanical and Aerospace Science and Technologies, Portugal); Pedro D Gaspar (University of Beira Interior & C-MAST – Centre for Mechanical and Aerospace Science and Technologies, Portugal)
Bird damage to orchards causes large monetary losses to farmers. Traditional methods such as bird cannons and tree netting have become inefficient in the long run, along with their high maintenance and reduced mobility. Due to their versatility, Unmanned Aerial Vehicles (UAVs) can be very useful for solving this problem; however, their low autonomy makes improved flight planning necessary. In this article, an optimization algorithm for path planning of UAVs based on Particle Swarm Optimization (PSO) is presented. This technique was chosen as an entry optimization algorithm to start the initial tests: the PSO algorithm is simple and has few control parameters while maintaining good performance. The path planning optimization algorithm aims to manage the drone's distance and flight time, applying optimization and randomness techniques, to overcome the disadvantages of other systems. The performance of the proposed algorithm was tested in a three-case simulation that represents all the possible cases.
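A minimal PSO loop conveys the idea: each particle is pulled toward its own best-known position and the swarm's best. The cost function below is a toy stand-in for a flight-cost surface, not the article's objective, and all parameters are illustrative defaults:

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=42):
    """Minimal particle swarm optimizer for a low-dimensional objective."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]                      # inertia
                             + c1 * r1 * (pbest[i][d] - pos[i][d])  # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for a flight-cost surface with its minimum at (3, -2).
cost = lambda p: (p[0] - 3) ** 2 + (p[1] + 2) ** 2
best, best_val = pso(cost, [(-10, 10), (-10, 10)])
```

In the path-planning setting, a particle would instead encode a sequence of waypoints, and the objective would combine distance, flight time, and battery constraints.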
15:30 Group Decision Model for Logistic Performance Analysis: a parallel between contracting and outsourced companies
Business logistics has an undeniable strategic importance, and organizations can reach new markets and improve service levels for their customers by outsourcing. To maximize the benefits of the relationship between a shipper (i.e., logistics buyer) and a logistics provider, an alignment of goals and objectives is required. Thus, this paper develops a model that connects the perceptions of both parties. The model is based on the trade-off matrix, the priority matrix, and the importance-performance analysis matrix. A case study was conducted with two companies located in Caruaru-PE, Brazil. This case study highlights the divergent perceptions between the companies studied and how the proposed methodology can help direct actions that create synergy in the supply chain.
S7C: Computerized Decision Aid 7
14:30 Automatically Navigating Protein Interaction Networks with a Software Product Line Approach
Daniel-Jesus Munoz (Universidad de Malaga & Instituto de Tecnologia e Ingenieria del Software (ITIS), Spain); Dina Medina-Vera (Instituto de Investigación Biomédica de Málaga-IBIMA, Universidad de Málaga & Hospital Regional Universitario de Málaga, Spain)
Software Product Line Engineering (SPLE) would categorise protein-protein interaction (PPI) networks as highly configurable systems, and the main issue with those is the impracticability of manually analysing all network/system scenarios. SPLE solves this issue by means of automated reasoners – tools that analyse every solution of an SPLE system. Hence, if we apply an SPLE approach to analyse PPI networks, we can automatically navigate through every possible PPI pathway, including uncovering new ones with regard to the current literature, providing computer decision aid to both bio-practitioners and researchers. While PPI networks are represented by the standard Systems Biology Graphical Notation (SBGN), product lines are represented by Variability Models (VMs). We present an approach where SBGN diagrams are transformed into SPLE VMs, providing compatibility between PPI networks and automated reasoners. We conjecture that protein artefacts are, in essence, variability features, and that signalling interactions are first-order logic relationships with certain cardinalities. We then analyse a reduced cell PPI (i.e., the Akt pathway) case study represented as a Clafer model with 9 features and 3 cross-tree constraints. The model is analysed with the chocosolver reasoner – a state-of-the-art multi-purpose automated reasoner. The evaluation uncovered 84 pathways and was carried out in less than a millisecond on a Raspberry Pi 3B+. We demonstrate how the approach can benefit biomedical research by supporting the creation, update, and re-usability of PPI network VMs.
14:50 LSTM Based Approach for Classifying Twitter Posts for Movie Success Prediction
Social media like Twitter contain rich information about people's preferences, but effectively utilizing and interpreting those data is a struggle. Models are created from these large quantities of data to predict behavior and tendencies. People share their thoughts about movies on Twitter, and the movie industry is a very important sector in the global market, so it is important to maximize profit by predicting the success of a movie. In this study, we propose an approach to classify Twitter posts to predict the success of movies using a Long Short-Term Memory (LSTM) based approach. Our approach outperformed the existing approach, obtaining 83.97% accuracy.
15:10 Soft Sensor for Online Prediction of Cement Fineness in Ball Mill
Karina Andreatta (Federal Institute of Espírito Santo (IFES) & Densyx Soluções, Brazil); Filipe Apóstolo (Densyx Soluções, Brazil); Reginaldo Barbosa Nunes (Federal Institute of Espirito Santo & Federal University of Espirito Santo, Brazil)
This paper describes the design and implementation of a soft sensor based on a backpropagation neural network model to predict cement fineness online in a ball mill. The input variables of the models were selected by studying the cement grinding process, and laboratory fineness results were collected to obtain the output variable. The paper introduces a procedure for the extraction, analysis, treatment, and cleaning of raw data received from the factory, which yielded a low prediction error. Estimating this variable in real time can be extremely useful for maintaining the desired fineness during the cement grinding process, which also allows a significant increase in the system's energy efficiency. The model with the highest performance and ability to predict fineness was selected for implementation as the soft sensor. The developed system was tested in a cement mill grinding process, and good results were achieved, demonstrating its ability to provide information about variables previously obtained only through offline laboratory tests.
15:30 An approach to automatically measure and visualize class cohesion in object-oriented systems
Developing systems with high quality is the motto of software engineering. In object-oriented systems, class cohesion is a significant quality attribute that impacts other quality attributes such as maintainability and reusability. Software engineers and developers must spend much time and effort to measure cohesion, and measuring it automatically helps overcome this manual measurement problem. In this paper, we propose an approach to automatically measure and visualize the cohesion within classes of object-oriented programs. The generated representations give an overall view of the cohesion in an effective and interactive way. The approach parses the program source code, using an existing tool, into an XML file and extracts the class tokens according to the definitions of the cohesion metrics. It then determines the cohesion relationships by matching these tokens with class features. Finally, it generates interactive visualizations of the cohesion using several charts. The proposed approach has been validated through a case study. The results showed that the generated visualizations provide a comprehensive view of the program's cohesion and facilitate a proper estimation of software quality based on its cohesion degree.
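The token-matching idea can be sketched with an LCOM1-style metric: count method pairs that share no instance attribute against pairs that share at least one. The snippet below uses Python's ast module in place of the paper's XML parsing pipeline, and the example class is invented:

```python
import ast

def lcom(source, class_name):
    """LCOM1-style cohesion: method pairs sharing no instance attribute
    minus pairs sharing at least one, floored at zero (higher = less cohesive)."""
    tree = ast.parse(source)
    cls = next(n for n in ast.walk(tree)
               if isinstance(n, ast.ClassDef) and n.name == class_name)
    uses = []
    for fn in cls.body:
        if isinstance(fn, ast.FunctionDef):
            # every `self.<attr>` token appearing in the method body
            attrs = {n.attr for n in ast.walk(fn)
                     if isinstance(n, ast.Attribute)
                     and isinstance(n.value, ast.Name) and n.value.id == "self"}
            uses.append(attrs)
    p = q = 0
    for i in range(len(uses)):
        for j in range(i + 1, len(uses)):
            if uses[i] & uses[j]:
                q += 1
            else:
                p += 1
    return max(p - q, 0)

code = """
class Point:
    def __init__(self):
        self.x = 0
        self.y = 0
    def move(self, dx, dy):
        self.x += dx
        self.y += dy
    def describe(self):
        self.label = "point"
        return self.label
"""
score = lcom(code, "Point")
# score == 1: describe() touches no attribute shared with the other methods
```

Such per-class scores are exactly what the proposed visualizations would render as interactive charts.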
S7D: Computerized Decision Aid 8
14:30 Friend Recommendation Decision Systems via Multiple Social Network Alignment
Today, almost all internet users have accounts on several different social networks for interacting with friends and other users. Gathering data from various networks and combining them into a single node can increase the success rate of recommendation systems. In this study, data related to thousands of users in nine different social networks are used to make successful recommendations to users. The anchor method is used for topological alignment, and the relationship between nodes is taken into account in the calculation. A node similarity method is also used to increase the success rate; thanks to its feature selection criteria, the number of successful node matchings is increased. Original node alignment and node similarity methods are proposed in the study. By combining both node alignment and node similarity, the proposed method is very successful for friend recommendation.
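Neighbourhood-overlap similarity, a common building block in network alignment, can be sketched as follows; the greedy matcher and the toy accounts are illustrative only, not the study's proposed method:

```python
def jaccard_similarity(neigh_a, neigh_b):
    """Jaccard overlap of two nodes' neighbour sets (anchor users are
    assumed to carry the same identifier in both networks)."""
    if not neigh_a and not neigh_b:
        return 0.0
    return len(neigh_a & neigh_b) / len(neigh_a | neigh_b)

def best_alignment(net1, net2):
    """Greedy alignment: match every node of net1 to its most similar
    node of net2. net1/net2: dict node -> set of neighbours."""
    return {u: max(net2, key=lambda v: jaccard_similarity(net1[u], net2[v]))
            for u in net1}

# Hypothetical accounts: "alice" and "a.smith" share friends bob and carol.
net1 = {"alice": {"bob", "carol", "dan"}, "erin": {"frank"}}
net2 = {"a.smith": {"bob", "carol"}, "f.jones": {"frank", "grace"}}
mapping = best_alignment(net1, net2)
# mapping == {"alice": "a.smith", "erin": "f.jones"}
```

Once accounts across networks are aligned into one node, the union of their friend lists feeds the recommender.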
14:56 Neural Network-Support Vector Machine for Sinusitis Classification
Sinusitis is a common health problem, especially in childhood and adolescence, caused by inflammation of the air-filled sinus cavities in the skull. In its early stages, sinusitis does not present any symptoms in the findings and diagnoses of the health sector. Therefore, this research used a machine learning method combining different models, namely the Neural Network-Support Vector Machine. The Neural Network is a popular method with a wide range of applications, known for its high accuracy. Support Vector Machines with several kernel functions are also commonly used in the classification of diseases and are likewise known for high accuracy. This research discusses the combination of a Neural Network and a Support Vector Machine as a classifier for the categorization of sinusitis. The NN-SVM combination with several linear kernel functions can thus be compared as a classifier and used by the health sector for efficient diagnosis.
15:23 Measuring the Accuracy of SVM with Varying Kernel Function for Classification of Indonesian Wayang on Images
The support vector machine (SVM) is a method often used in various studies for pattern recognition of objects in images. SVM is a classification technique characterized by a training process and a testing process. When classifying image data, SVM is assisted by a feature extraction technique that normalizes the data so that a good training process can be carried out. However, SVM generally uses a linear kernel function in the testing phase. This paper therefore compares several other kernel functions, such as the cubic and quadratic kernel functions, in classifying images of Indonesian Wayang, the legacy of Indonesian ancestors. By varying the kernel function in SVM, the cubic kernel function achieves an accuracy of 83.4%.
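The kernels being compared differ only in the polynomial degree applied to the dot product of two feature vectors. A minimal sketch, with illustrative vectors rather than Wayang image features:

```python
def poly_kernel(x, y, degree, coef0=1.0):
    """Polynomial kernel (x . y + coef0) ** degree.

    degree=2 gives the quadratic kernel, degree=3 the cubic kernel;
    with coef0=0 and degree=1 it reduces to the linear kernel.
    """
    dot = sum(a * b for a, b in zip(x, y))
    return (dot + coef0) ** degree

x, y = [1.0, 2.0], [0.5, -1.0]
linear = poly_kernel(x, y, 1, coef0=0.0)   # -1.5
quadratic = poly_kernel(x, y, 2)           # (-1.5 + 1.0) ** 2 == 0.25
cubic = poly_kernel(x, y, 3)               # (-0.5) ** 3 == -0.125
```

Higher degrees let the decision boundary bend more in feature space, which is presumably why the cubic kernel edges out the linear one on this dataset.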
S7E: Decision Making Using IoT and ML 10
14:30 Random Net Implementation of MLP and LSTMs Using Averaging Ensembles of Deep Learning Models
Vinayak Ashok Bharadi (Mumbai University & Finolex Academy of Management and Technology, India)
Long Short-Term Memory (LSTM) networks are a special type of recurrent neural network with better performance, widely used for sequence prediction. The multilayer perceptron (MLP) provides a nonlinear mapping between an input vector and a corresponding output vector; MLPs are a popular class of feedforward neural networks used for binary as well as multi-class classification. Ensemble learning is a technique in which the variance in the performance of a deep learning model is reduced by combining the learning of various models trained on different training data and other parameters. This paper discusses an experiment where this mechanism is implemented for binary classification using MLP and LSTM sequence prediction. A Random Net implementation using ensembles of MLPs as well as LSTMs is discussed. The network is trained for a random length of the training sequence, and the performance of single and ensemble classifiers is compared. The results show that the ensemble classifier is more robust and has better stability.
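The averaging mechanism itself is small: collect each member's positive-class probability and threshold the mean. The model outputs below are hypothetical, standing in for trained MLP/LSTM members:

```python
def ensemble_predict(member_probs, threshold=0.5):
    """Average positive-class probabilities across models, then threshold."""
    avg = [sum(col) / len(col) for col in zip(*member_probs)]
    return avg, [int(p >= threshold) for p in avg]

# Hypothetical outputs of three trained models on four samples.
model_a = [0.9, 0.2, 0.6, 0.4]
model_b = [0.8, 0.1, 0.4, 0.6]
model_c = [0.7, 0.3, 0.2, 0.8]
avg, labels = ensemble_predict([model_a, model_b, model_c])
# avg is about [0.8, 0.2, 0.4, 0.6] -> labels == [1, 0, 0, 1]
```

Where members disagree (the last two samples), the average smooths out individual errors; this is the source of the variance reduction the abstract describes.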
14:50 IoT Driven Resiliency with Artificial Intelligence, Machine Learning and Analytics for Digital Transformation
Sumera Ahmad (Universiti Teknologi Malaysia, Malaysia)
A new manufacturing era, "Industry 4.0", is emerging with two unique characteristics: intelligent manufacturing and integrated manufacturing. This pattern aligns with the progress of digital transformation, in which efficient manufacturing and production systems are continuously pursued. Digital transformation initiatives generate large data sets due to the massive integration of devices in the Internet of Things (IoT) environment. This scenario demands the fastest insights to respond on time, considering three key pillars: communication network evolution, digital business, and customer experience. IoT-driven resiliency with traditional analytics has limited value without artificial intelligence (AI) and machine learning (ML). This study aims to explore this phenomenon of interest by conducting group discussions with software vendors. The results will be helpful in utilizing the power of AI and ML with analytics to leverage large amounts of data, contributing to the success of the digital transformation of organizations with real-time decision-making.
15:10 Hybrid encryption protocol for RFID Data Security
Radio Frequency Identification (RFID) is an evolving technology that brings benefits through high productivity and efficiency in applications where artifacts have to be automatically identified; point-of-sale systems and library systems are among the many applications of RFID technology. As security and privacy challenges increase day by day, RFID technology should be improved in its strategies and tactics. Asymmetric-key and symmetric-key cryptography have separately provided good solutions to such security challenges. We propose a hybrid encryption method for the encryption and decryption of RFID data; the hybrid encryption protocol combines both asymmetric and symmetric ciphers. The Advanced Encryption Standard (AES) is proposed as the symmetric cipher, as it has proved to be highly secure, fast, well standardized, and well supported. Elliptic curve cryptography (ECC) is proposed as the asymmetric cipher because it uses smaller keys and signatures and provides rapid key generation, rapid key agreement, and rapid signatures. This combination serves RFID technology better than either cipher alone. For security analysis of the hybrid protocol, the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool is used. The goal is to propose a hybrid encryption protocol using the AES and ECC algorithms to enhance RFID data security with the most effective and efficient methods.
15:30 YOLO v4 Based Human Detection System Using Aerial Thermal Imaging for UAV Based Surveillance Applications
Prashanth Kannadaguli (Dhaarini Academy of Technical Education, India)
This work concerns building a human detection system based on You Only Look Once (YOLO) v4, one of the most recent deep learning approaches, originally built on a single-shot detection proposal. Unlike two-stage region-based object detection schemes, this technique does not rely on semantic segmentation, does not suffer loss of object information such as vanishing gradients, and does not require pre-defined anchors. It comprises strong feature extractors, reinforces multi-scale object detection, and is very fast in multi-threaded GPU environments. Since our fundamental research concentrates on object classification for Unmanned Aerial Vehicle (UAV) applications, as a first step we chose to detect humans in a thermal dataset. We therefore used thermal images and videos obtained from the thermal cameras of a UAV flying 1 m to 50 m above ground level as our dataset for building and testing the model. YOLO v4 trains on ground-truth bounding boxes and incorporates techniques such as Weighted Residual Connections (WRC), Cross Stage Partial Connections (CSP), Cross mini-Batch Normalization (CmBN), Self-Adversarial Training (SAT), Mish Activation (MA), Mosaic Data Augmentation (MDA), and DropBlock Regularization (DBR). Finally, the performance analysis of the model in terms of mean Average Precision (mAP) indicates that modelling with YOLO v4 performs in a promising way and can be used in automatic human detection systems.
S7F: Economics and Investment Decision Aid 2
14:30 Prediction of Real Estate Land Prices in the Kingdom of Bahrain
Data mining plays an important role in prediction model development and knowledge discovery. This paper follows the Cross-Industry Standard Process for Data Mining to develop a prediction model for real estate land prices in different cities and villages within the Kingdom of Bahrain. The data were collected from different sources, including social media networks, official newspapers, and advertisement newspapers, and then analyzed using the WEKA data mining tool for model development and analysis. The objective of this research is to use data mining to help real estate advisors and consumers estimate real estate land prices and to cope with changes in land prices according to market trends. The study considers a number of factors which affect the land price, such as land length and width, location, and land classification. Linear regression is used for the analysis. The study shows that the model is affected by the variance of land prices between different areas within the Kingdom, and therefore a number of enhancements have been suggested for future development.
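The regression step can be sketched with single-predictor ordinary least squares; the study fits several factors, and the listings below are invented figures, not Bahraini market data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx            # slope: price change per extra unit of area
    a = my - b * mx          # intercept
    return a, b

# Hypothetical listings: plot area in square metres vs asking price.
areas = [300, 400, 500, 600]
prices = [24000, 31000, 40000, 47000]
a, b = fit_line(areas, prices)
predicted = a + b * 450      # estimate for a 450 m^2 plot -> 35500.0
```

With multiple factors (location, classification, dimensions), the same idea generalizes to solving the normal equations over a design matrix, which is what WEKA's LinearRegression does internally.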
14:56 What motivates Shariah compliant companies in applying accounting conservatism?
This study investigates the potential of corporate governance attributes (board independence, board leadership, ownership concentration, and audit committee) and bankruptcy prediction in the use of accounting conservatism by shariah compliant companies listed on Amana Income Investor, the first Islamic index in the world. The sample consists of the top 10 companies listed on Amana Income Investor from 2010-2019. The 98 observations, comprising the annual reports of the selected companies, are analyzed using multiple linear regression. The findings indicate that corporate governance attributes such as board independence, board leadership, and ownership concentration have no association with accounting conservatism. By contrast, the audit committee and bankruptcy prediction have a significant effect on accounting conservatism. This study implies that the financial reporting environment for shariah compliant businesses is relatively unique and indicates support for dedicated accounting principles and financial accounting standards for shariah compliant businesses.
15:23 Embracing of Fintech in Islamic Finance in the post COVID era
Mustafa Raza Rabbani (Kingdom University, Bahrain); Yomna Abdulla (University of Bahrain, Bahrain); Abu Bashar (IMS Unison University, India); Shahnawaz Khan (University College of Bahrain, Bahrain); Mahmood Ali (University of Bahrain, Bahrain)
The novel coronavirus (COVID-19) is a phenomenon whose aftereffects will be felt for years to come. The economic consequences of the pandemic for the Islamic finance industry are substantial, as is evident from the slow growth forecast for the industry by various agencies. In this paper, we consider this pandemic as an opportunity for the Islamic finance industry to grow, prove its worth again after the global financial crisis of 2008, and emerge as a major contender to the conventional financial system. We develop a model suggesting that COVID-19 is an opportunity for more integrated and transformative growth with a high level of standardization, with the industry's key focus on social causes and the tactical adoption of financial technology. The paper has implications for the Islamic finance and banking industry, as it provides a framework for future researchers and practitioners to understand and adopt Islamic finance in the post-COVID-19 era. Keywords: financial contagion; sustainable finance; post-COVID-19; Islamic finance; FinTech
S7G: Decision Making Using IoT and ML 11
14:30 Managing Big Data using Model Driven Engineering: From Big Data Meta-model to Cloudera PSM meta-model
In the era of Big Data, all our actions generate digital traces, and a huge amount of data is generated daily. These data are closely related to personal life and can be exploited in different fields. There are also internally generated data, known as offline data, created by the operations of organizations. The processing of this big data plays a key role in decision making. Building on the generic meta-model for the processing layer defined in previous work, this paper presents a shift from the generic processing-layer meta-model for Big Data (PIM) to a processing-layer meta-model for the Cloudera distribution (PSM) through the use of the ATL transformation language. The result of the ATL transformation represents the PSM (Platform-Specific Model) according to the MDA architecture.
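The PIM-to-PSM idea can be sketched as a model-to-model rewrite, in the spirit of an ATL rule: each generic processing element is mapped to a Cloudera-specific counterpart. The element types and the mapping table below are illustrative assumptions, not the paper's actual meta-models:

```python
# Sketch of a PIM-to-PSM transformation rule. Real MDA tooling (ATL)
# operates on meta-models; here a dict stands in for the mapping rules.
# All type names are hypothetical examples.
PIM_TO_CLOUDERA = {
    "BatchProcessing": "MapReduce",
    "StreamProcessing": "SparkStreaming",
    "Storage": "HDFS",
}

def transform(pim_model):
    """Rewrite each platform-independent element into its
    Cloudera platform-specific counterpart."""
    return [{"name": e["name"], "type": PIM_TO_CLOUDERA[e["type"]]}
            for e in pim_model]

psm = transform([{"name": "ingest", "type": "StreamProcessing"}])
```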
14:50 Classification of Binary Class HIV/AIDS Test Results Using Ensemble Learning Models
Daniel M Belete (Mangalore University, India)
HIV/AIDS datasets lack adequate and balanced samples and include repetitive and unnecessary features that lead to high-dimensional feature spaces. To address this problem, feature selection is examined. In this research, we propose an ensemble learning method for the classification of binary-class HIV/AIDS test results. The backward feature selection (BFS) wrapper method is used for feature selection. Five established classifiers are used, namely Gradient Boosting (GB), Multilayer Perceptron (MLP), Random Forest (RF), Extra Tree (ET), and K-nearest neighbor (KNN). For training and testing of the model, 10-fold cross-validation is applied. Experiments are carried out on the EDHS-HIV/AIDS dataset. Various performance measurements are used to assess the model's performance, and the confusion matrix is used to demonstrate whether the samples are labeled correctly. A review of the results of each classifier is provided based on the performance evaluation parameters. The results show significant performance enhancements for all classifiers when the selected features are used in place of the original dataset.
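Wrapper-style backward feature selection, as described, can be sketched as a greedy loop: repeatedly drop the feature whose removal hurts a validation score least. The scorer below is a toy stand-in for the cross-validated accuracy of any of the five classifiers, and the feature names are hypothetical:

```python
# Sketch of backward feature selection (BFS) with a wrapper scorer.
def backward_select(features, score, k):
    """Greedily remove features until k remain.
    score: callable evaluating a feature subset (higher is better)."""
    selected = list(features)
    while len(selected) > k:
        # Try dropping each remaining feature; keep the best subset.
        candidates = [[f for f in selected if f != drop] for drop in selected]
        selected = max(candidates, key=score)
    return selected

# Toy scorer: pretend only two features carry signal, with a small
# penalty per retained feature (a stand-in for CV accuracy).
useful = {"cd4_count", "age"}
toy_score = lambda subset: len(useful & set(subset)) - 0.01 * len(subset)

kept = backward_select(["cd4_count", "age", "id", "noise"], toy_score, 2)
```

In practice the scorer would refit the chosen classifier under 10-fold cross-validation at every step, which is what makes wrapper methods expensive but feature-aware.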
15:10 A Two Tier Iterative Ensemble Method To Tackle Imbalance In Multiclass Classification
Sridhar S (SRM Institute Of Science and Technology, India)
In real-world applications, it is very common for data skewness to occur among multiple classes. Several attempts have been made in the past to overcome this imbalance problem, which is a serious issue for standard machine learning techniques, especially classification and regression, but there is still a need to handle it effectively. Imbalanced datasets generally include both safe and unsafe minority samples. Our proposed approach is a classifier-independent, two-tier iterative ensemble approach that focuses on the influence of rare minority samples when learning from imbalanced datasets. Most informed oversampling techniques, such as SMOTE and its variants, cannot be applied directly to rare-class samples, especially when the count of rare samples is too low. To alleviate this problem, we propose a hybrid oversampling technique, applied at different levels, to learn from rare and outlying samples and balance the classes. The goal is to tone down the data imbalance at the preprocessing stage itself, by correcting or balancing the training datasets before the learning phase, which lets the classifier focus on its primary role and thereby improves the learning process. The proposed two-tier iterative ensemble approach shows a significant improvement in learning from multiclass imbalanced data, as is clearly evident from the experimental results.
15:30 IoT Based Remote Patient Monitoring System
Sidra Maqbool (Superior University Lahore, Pakistan); Muhammad Waseem Iqbal (The Superior College, Lahore, Pakistan); Muhammad Raza Naqvi and Khawaja Sarmad Arif (The Superior College Lahore, Pakistan); Muhammad Ahmed (The Superior College, Pakistan); Muhammad Arif (The Superior University, Lahore, Pakistan)
Remote patient monitoring is one of the most in-demand areas of study in IoT healthcare applications. While IoT healthcare applications are very useful, people in rural areas face problems accessing professional healthcare services due to a lack of doctors and hospitals and the long distance from the city. In this situation, remote patient monitoring is the best solution to overcome health issues. In this paper, we propose an IoT-based real-time patient monitoring system that uses Message Queuing Telemetry Transport (MQTT) for messaging and for transmitting electrocardiogram (ECG) data from the proposed app to a web server in real time. The doctor can access the ECG data using a smartphone or desktop computer. The proposed app has been tested in both public and private (LAN and WAN) network environments. The results show no packet loss or packet errors in either the LAN or the WAN network.
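The MQTT publish/subscribe pattern the system relies on can be modeled in a few lines: the monitoring app publishes ECG samples to a topic, and the doctor's client subscribes to it. A real deployment would use an MQTT client library and broker; the in-memory broker and topic name below are illustrative assumptions:

```python
# Toy in-memory model of MQTT topic-based publish/subscribe.
class Broker:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        """Register a callback to receive messages on a topic."""
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        """Deliver a payload to every subscriber of the topic."""
        for cb in self.subscribers.get(topic, []):
            cb(payload)

broker = Broker()
received = []
broker.subscribe("patients/42/ecg", received.append)        # doctor's client
broker.publish("patients/42/ecg", {"t": 0.0, "mv": 0.12})   # monitoring app
```

MQTT's broker-mediated design is what lets a low-power sensor app and a doctor's phone exchange data without knowing each other's addresses.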
S7H: Financial Decision Making 3
14:30 Payout Decision pre- and during COVID-19: Evidence from Bahrain
There are several significant decisions taken during the lifespan of any firm, and the payout decision is no doubt a critical one among them. In this paper, we first examine the payout decision and dividend smoothing of firms listed on Bahrain Bourse during the period 2017-2019. Second, we investigate whether the payout decision of Bahraini firms has been affected by COVID-19 by utilizing data from the first two quarters of 2020. The findings show that, during the pre-COVID-19 period, non-financial firms had a higher percentage of dividend payers and smoother dividends than financial firms. Furthermore, the payout decisions of both types of firms were relatively affected by COVID-19.
14:50 Macroeconomic determinants of nonperforming loans: Empirical evidence from GCC countries
The current study aims to investigate the macroeconomic determinants of nonperforming loans (NPLs) in the GCC economies during the period spanning 2000 to 2018. A generalized method of moments is used to identify the factors that play an imperative role in determining NPLs. The findings reveal that deterioration of macroeconomic factors and economic shocks increase banks' credit risk. Thus, policymakers and authorities must strive to maintain a healthy economy and apply macroprudential regulations to enhance banks' financial stability and minimize credit risk.
15:10 The asymmetric association among the consumption of gasoline, financial advancement, and economic freedom in Singapore
This paper explores the impact of four main factors influencing gasoline consumption, namely financial expansion, economic freedom, expansion of the economy, and gasoline prices, in the case of Singapore for the period 1996-2017. The asymmetric NARDL approach was applied. The findings revealed that financial advancement, expansion of the economy, and economic freedom significantly reduce gasoline consumption in Singapore. Furthermore, the causality test revealed a unidirectional causality relationship running from financial expansion to gasoline consumption, and a bi-directional causality between economic freedom and gasoline consumption.
15:30 Levels of Financial Inclusion in the WAEMU Countries: A case study using DEA
In this study, we propose a novel methodological approach to evaluate the levels of financial inclusion of countries using an aggregate multidimensional index based on Data Envelopment Analysis (DEA). The approach is applied to the countries of the West African Economic and Monetary Union (WAEMU), using the financial inclusion definition and indicators defined by the Central Bank of West African States (BCEAO). We observed that all eight countries steadily increased their levels of inclusion between 2010 and 2017. The ensuing benchmarking indicates that Senegal is the leader of the Union when it comes to financial inclusion. Further, Senegal, Benin, Ivory Coast, and Togo are the benchmark countries. For the four remaining countries, we identify reference countries that they should emulate to increase their financial inclusion levels.
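The DEA idea can be illustrated in its simplest form, one input and one output, where each country's efficiency score is its output/input ratio relative to the best ratio on the frontier. The full study solves a DEA linear program over multiple inclusion indicators; the single-ratio version and the figures below are purely illustrative:

```python
# Toy single-input, single-output DEA efficiency: score each decision-
# making unit (country) against the best output/input ratio observed.
def dea_single(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]  # 1.0 = on the frontier

# Hypothetical data: input = banking infrastructure index,
# output = share of adults with an account.
scores = dea_single([2.0, 4.0, 5.0], [1.0, 1.0, 2.5])
```

Countries scoring 1.0 form the benchmark set; the others can read off how far they sit from the frontier, which is the basis of the reference-country recommendations.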
Plenary 3: Plenary Session
Closing: Closing Ceremony