11th International Workshop

Data Analysis Methods for Software Systems
XI DAMSS
Druskininkai, Lithuania,
Hotel „Europa Royale“,
http://www.mii.vu.lt/DAMSS



DAMSS 2018: Plenary Speakers



Assoc. Prof. Ana Camanho

Associate Professor at the Faculty of Engineering of the University of Porto, Portugal (FEUP). She is the Director of the Master of Science in Industrial Engineering and Management at FEUP. She holds a Ph.D. in Industrial and Business Studies from Warwick Business School, University of Warwick, United Kingdom. Her research interests include the assessment of performance and the evaluation of productivity change over time, focusing primarily on the Data Envelopment Analysis technique. She has been involved in research projects, both at the national and international level, in the following sectors: banking, retailing, healthcare, education, fisheries, the construction industry, regulation of electricity distribution companies, water utilities, the mining industry, transportation systems and urban quality of life. She was vice-president of the Portuguese Operational Research Society and Pro-director of the Ph.D. program in Industrial Engineering and Management of the School of Engineering of the University of Porto. She has published more than 60 articles in international peer-reviewed journals, which have received more than 1300 citations to date. She has been a member of the programme and organizing committees of several national and international conferences.

Talk title: Benchmarking Secondary Schools Using Students’ Results at University

Abstract: In this research, we use data on university students' first-year scores to compare and benchmark secondary schools on their ability to lead students to success at university. Data from two universities in Portugal, the Portuguese Catholic University (with seven faculties) and the University of Porto (with 14 faculties), were used to compute a number of metrics, based on which secondary schools were compared. The performance of more than 10,000 students from 65 different degrees was explored over a three-year period.

This paper addresses two main research questions: (1) what are the determinants of success for first-year university students? and (2) what is the role of the secondary school of origin in determining this success? The regression model used to test these hypotheses assumes that university success in the first year is a function of student characteristics (e.g., scores on entry to higher education), secondary school characteristics (e.g., private vs. public) and university/course characteristics (e.g., public vs. private university and the main scientific area of the course). University success is captured through two measures: the number of ECTS credits completed by first-year university students and the final grade at the end of the first year (this final grade is normalised per degree attended and cohort to allow comparability between different degrees and years). Conclusions from the preliminary analysis point to the importance of scores on entry (based on standardized national exams), but this importance is relatively small in explaining the success of students at the university (it explains to a certain degree the average scores obtained at the end of the first year, but not the number of ECTS credits completed). The type of school appears as the second most important factor in determining students' success, with public schools performing better than private ones.
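The per-degree, per-cohort normalisation of final grades described above amounts to a group-wise z-score. A minimal sketch in Python, with hypothetical field names, could look as follows:

```python
from collections import defaultdict
from statistics import mean, pstdev

def normalise_grades(records):
    """Z-score each student's final grade within their (degree, cohort)
    group, so that grades are comparable across degrees and years.
    `records` is a list of dicts with keys 'degree', 'cohort', 'grade'
    (hypothetical field names, not the study's actual data schema)."""
    groups = defaultdict(list)
    for r in records:
        groups[(r["degree"], r["cohort"])].append(r["grade"])
    # mean and population standard deviation per (degree, cohort) group
    stats = {k: (mean(v), pstdev(v)) for k, v in groups.items()}
    out = []
    for r in records:
        mu, sigma = stats[(r["degree"], r["cohort"])]
        z = 0.0 if sigma == 0 else (r["grade"] - mu) / sigma
        out.append({**r, "grade_norm": z})
    return out
```

A student's normalised grade then measures how far they stand from the average of their own degree and cohort, in standard-deviation units.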

A DEA model was constructed to compare secondary schools on the extent to which they prepare students for university success. For this purpose, a set of outputs relating to success in the first year of higher education was defined (e.g., the percentage of students in the top percentile of the degree attended, the average number of ECTS credits completed by the students, and the average score of students at the end of the first year, normalised according to the degree attended). The input of the DEA model is the students' score on entry, normalised according to the degree attended by each student. This model evaluates secondary schools by the value added accomplished by their students when attending higher education. The model is unprecedented in the literature, as it evaluates secondary schools based on the performance of students in higher education, which is undisputedly a key objective of secondary school education.
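For readers unfamiliar with DEA, the idea is easiest to see in its simplest form. The model in the talk has one input and several outputs, which requires solving a small linear program per school; in the special case of a single input and a single output, the constant-returns-to-scale (CCR) efficiency score reduces to comparing productivity ratios, as in this illustrative snippet (not the authors' implementation):

```python
def dea_efficiency(inputs, outputs):
    """CCR (constant returns to scale) DEA efficiency in the simplest
    single-input, single-output case: each unit's output-to-input ratio
    divided by the best observed ratio. Efficient units score 1.0.
    Here a 'unit' would be a secondary school, the input its students'
    normalised entry scores and the output their first-year success."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]
```

For example, a school whose students achieve the same first-year results as the best school but started from twice the entry score would receive an efficiency of 0.5, capturing the "value added" interpretation above.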



Academician Florin Gheorghe Filip

Acad. Florin Gheorghe Filip was born on 25 July 1947. He became a corresponding member of the Romanian Academy in 1991, when he was only 44 years old, and in 1999, at 52, a full member of this highest cultural and scientific forum of Romania. For ten years (2000-2010) he was Vice-president of the Romanian Academy (the national academy of sciences), and in 2010 he was elected President of its 14th Section, "Information Science and Technology" (re-elected in 2015). He was the managing director of the National Institute for R&D in Informatics (ICI) from 1991 to 1997. His main scientific interests are: optimization and control of complex systems, decision support systems, technology management and foresight, and IT applications in the cultural sector. He has authored/coauthored over 300 papers published in international journals (IFAC J. Automatica, IFAC J. Control Engineering Practice, Annual Reviews in Control, Computers in Industry, System Analysis Modeling Simulation, Large Scale Systems, Technological and Economic Development of Economy, and so on) and in contributed volumes printed by international publishing houses (Pergamon Press, North Holland, Elsevier, Kluwer, Chapman & Hall, etc.). He is also the author/coauthor of thirteen monographs (published by Editura Tehnica, Bucharest; Hermes-Lavoisier, Paris; J. Wiley & Sons, London; Springer) and editor/co-editor of 24 volumes of contributions (published by Editura Academiei Romane; Elsevier Science; Institute of Physics, Melville, USA; IEEE Computer Society, Los Alamitos, USA). He has been an IPC member of more than 50 international conferences held in Europe, the USA, South America, Asia and Africa, and has given plenary papers at scientific conferences held in Brazil, Chile, China, France, Germany, Lithuania, Poland, Portugal, the Republic of Moldova, Spain, Sweden, Tunisia and the UK.
F.G. Filip was the chairman of the IFAC (International Federation of Automatic Control) Technical Committee "Large Scale Complex Systems" (1991-1997). He is founder and Editor-in-Chief of the journal Studies in Informatics and Control (1991) and co-founder and Editor-in-Chief of the International Journal of Computers Communications & Control (2006). He has received the Doctor Honoris Causa title from "Lucian Blaga" University of Sibiu (2000), "Valahia" University of Targoviste (2007), "Ovidius" University of Constanta (2007), École Centrale de Lille, France (2007), "Traian Vuia" Technical University of Timisoara (2009), "Agora" University of Oradea (2012), the Academy of Economic Studies, Bucharest (2014), the University of Pitesti (2017), and the "Petrol-Gaz" University of Ploiesti (2017). He is an honorary member of the Academy of Sciences of the Republic of Moldova (2007) and of the Romanian Academy of Technical Sciences (2007).

Talk title: DSS-An Evolving Class of Information Systems

Abstract: The talk starts with a presentation of the basic concepts of a particular class of information systems, namely DSS (Decision Support Systems), which are meant to help the decision-maker solve complex decision problems that count. Various classifications made in accordance with specific criteria, such as type of support, number of users, decision-maker type and influence level, and technological orientation, are described. Evolution trends are presented with a particular emphasis on the modern I&C (Information and Communication) technologies utilized and the business models adopted. A set of design criteria and an associated multi-participant decision-making method for choosing an adequate practical solution are discussed.



Prof. G. Ester Martín Garzón

She is a full professor in the knowledge area of Computer Architecture and Technology. Her research has focused on High Performance Computing and sparse matrix computation; since 2009 her interest has centred on GPU computing and heterogeneous computational platforms. She has applied these techniques in the following lines: (1) HPC libraries for the solution of sparse systems; (2) image processing in electron microscopy; (3) tomographic reconstruction based on holographic data; (4) anomaly detection in hyperspectral images; (5) optimization algorithms characterized by their irregularity; (6) simulation of models in microrheology; (7) techniques for tuning energy consumption and performance on heterogeneous platforms. Her contributions to the development of libraries to accelerate sparse-matrix operations on GPUs are remarkable, as is her participation in the development of software for tomographic reconstruction on GPUs. Her research has been funded for the past 10 years through her participation in four national projects, four regional projects, two European COST actions and two thematic networks: e-science and CAPAP-H. Also noteworthy is her role in the Supercomputing-Algorithms research group at Almeria University, initially as a researcher and currently as the group leader.

Talk title: High Performance Computing: Platforms and Techniques

Abstract: Overcoming the challenges of High Performance Computing (HPC) is of the utmost importance for advancing the main topics of modern computation, such as the Internet of Things, Data Mining and Artificial Intelligence, since they have tremendous computational requirements. Supercomputers provide great computational resources, which can be harnessed through Cloud technology. Moreover, thanks to technological and architectural advances, current HPC platforms are not only located in supercomputing centers: standalone computers, smartphones and embedded systems can also be considered HPC platforms if they are appropriately exploited. These architectures are based on heterogeneous multi-core processors and include additional resources to take advantage of different kinds of parallelism in applications. This tutorial reviews the keys to current HPC platforms from the point of view of hardware technology and software interfaces, with a special focus on representative HPC applications.




Academician Janusz Kacprzyk

Academician Janusz Kacprzyk, Fellow of IEEE, IET, IFSA, EurAI and SMIA, graduated from Warsaw University of Technology, Poland, with an M.Sc. in automatic control and computer science, obtained a Ph.D. in systems analysis in 1977 and a D.Sc. in computer science in 1991. He is Professor of Computer Science at the Systems Research Institute, Polish Academy of Sciences, at WIT – Warsaw School of Information Technology, and at Chongqing Three Gorges University, Wanzhou, China, and Professor of Automatic Control at PIAP – Industrial Institute of Automation and Measurements. He is Honorary Foreign Professor at the Department of Mathematics, Yili Normal University, Xinjiang, China. He is a Full Member of the Polish Academy of Sciences, Member of Academia Europaea (Informatics), Member of the European Academy of Sciences and Arts (Technical Sciences), and Foreign Member of the Bulgarian Academy of Sciences, the Spanish Royal Academy of Economic and Financial Sciences (RACEF), and the Finnish Society of Sciences and Letters. He has received honorary doctorates (doctor honoris causa) from Széchenyi István University, Győr, Hungary, 2014; Óbuda University, Budapest, Hungary, 2016; Prof. Asen Zlatarov University, Bourgas, Bulgaria, 2017; and Lappeenranta University of Technology, Lappeenranta, Finland, 2017. He has been a frequent visiting professor in the USA, Italy, the UK, Mexico, China, Austria and Bulgaria, and has served on evaluation commissions of many foreign universities, as well as on panels for Advanced Grants at the European Research Council. His main research interests include the use of modern computational and artificial intelligence tools, notably fuzzy logic, in decisions, optimization, control, data analysis and data mining, with applications in databases, ICT, mobile robotics, systems modeling, etc. He has authored 6 books, (co)edited more than 100 volumes, and (co)authored ca. 550 papers, including ca. 80 in journals indexed by the WoS.
His bibliographic data are: according to Google Scholar, 22811 citations and an h-index of 68; according to Scopus, 6856 citations and an h-index of 38; according to WoS, 5297 citations and an h-index of 32. He is the editor-in-chief of 6 book series at Springer and of 2 journals, and is on the editorial boards of ca. 40 journals. He is a member of the IEEE CIS Award Committee, was Chair of the 2016 IEEE CIS Award Committee, was a member of the AdCom of IEEE CIS in 2011-2016 and again from 2018, and was a Distinguished Lecturer of IEEE CIS. He has received many awards: the 2006 IEEE CIS Pioneer Award in Fuzzy Systems; the 2006 Sixth Kaufmann Prize and Gold Medal for pioneering works on soft computing in economics and management; the 2007 Pioneer Award of the Silicon Valley Section of IEEE CIS for contributions to granular computing and computing with words; the 2010 Award of the Polish Neural Network Society for exceptional contributions to the Polish computational intelligence community; the IFSA 2013 Award for his lifetime achievements in fuzzy systems and service to the fuzzy community; the 2014 World Automation Congress Lifetime Award for contributions in soft computing; the 2016 Award of the International Neural Network Society – Indian Chapter for Outstanding Contributions to Computational Intelligence; and 2017 EUSFLAT (European Society for Fuzzy Logic and Technology) Honorary Membership. He is the President of the Polish Operational and Systems Research Society and Past President of the International Fuzzy Systems Association.

Talk title: Decisions in Human Centric Multiagent Systems: Rationality and Some Cognitive Biases

Abstract: We are concerned with systems in which there are multiple agents: humans, or software agents imitating human judgments, behavior, perception, evaluations, etc. The agents are faced with a set of options (alternatives or courses of action) and should find an option, or a set of options, that can be considered the best acceptable to the entire group. The agents provide testimonies as to their opinions on the goodness of the options, which can be given as preference relations, utility (valuation) functions, sets of approved/disapproved options, etc. The best option is then found on the basis of these testimonies and also some other aspects, such as the attitudes of the agents. The process is assumed to be run with the help of a moderator, a "super-agent" who guides it. First of all, the process involves a consensus reaching step in which the initial, usually highly different, testimonies of the agents are made closer, i.e., the agents change their testimonies, usually persuaded by the moderator. Then, after reaching a "consensus", i.e., closer testimonies, some group decision making (or voting) tools are used to obtain solutions, which are mainly some cores, i.e., sets of options that are undominated with respect to the other options in the opinion of most agents. We assume quite a general approach with fuzzy preferences and a fuzzy majority. It is shown that such an approach exhibits, first, some properties of a greedy, or selfish, approach, since the agents just take into account their own testimonies. A new approach is proposed in which some fairness occurs, in the sense that the testimonies of all agents are accounted for. Moreover, as this fairness-oriented approach involves changes of the agents' testimonies, with a fair suggestion of such changes to all agents, a new model is proposed which takes into account a well-known cognitive bias, the status quo bias. The cognitive bias is meant in the sense of Kahneman and Tversky, i.e., that people make judgments or decisions in ways that are systematically different from those obtained from traditional economic models, and the cognitive bias does not necessarily lead to bad decisions. Among the cognitive biases, the status quo bias, i.e., that people tend to avoid larger changes, is shown to be promising in the context considered. A possible use of some other cognitive biases is mentioned.
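To illustrate the consensus-reaching step with fuzzy preferences, a common way to quantify how close the agents' testimonies are is a soft consensus degree: the average pairwise similarity of their fuzzy preference relations. The sketch below is a simplified illustration of that idea, not the speaker's exact model:

```python
from itertools import combinations

def consensus_degree(prefs):
    """Soft consensus degree for a group of agents, each giving a fuzzy
    preference relation r[i][j] in [0, 1] over the same set of options.
    1.0 means identical testimonies, 0.0 means maximally different ones;
    the moderator can iterate persuasion rounds until the degree exceeds
    some agreed threshold."""
    n = len(prefs[0])  # number of options
    pair_sims = []
    for a, b in combinations(prefs, 2):
        # mean absolute difference over off-diagonal preference entries
        diffs = [abs(a[i][j] - b[i][j])
                 for i in range(n) for j in range(n) if i != j]
        pair_sims.append(1.0 - sum(diffs) / len(diffs))
    return sum(pair_sims) / len(pair_sims)
```

A fuzzy-majority version would replace the plain averages with an aggregation guided by a linguistic quantifier such as "most", but the averaging form above already captures the moderator's stopping criterion.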



Jonas Kubilius

After completing his dual degree in mathematics and physics at MIT (USA), Jonas obtained a Master's in artificial intelligence and a PhD in psychology at KU Leuven (Belgium). During those years, his research focused on the neural and behavioral underpinnings of human visual perception. He then returned to MIT as a postdoc with a Marie Skłodowska-Curie Global Fellowship in order to apply deep learning techniques to modeling human visual processing. Jonas is currently in the last year of the fellowship, back at KU Leuven, where his main interest is extending deep learning to cognitive tasks. Jonas is also active in communicating research to the public, giving lectures at TEDxVilnius, Cafe Scientifique, the National Students' Academy and the National Gallery of Art, and this year contributing to the Lithuanian Pavilion at the 16th International Architecture Exhibition – La Biennale di Venezia. His research has been covered by popular media and, recently, by Science magazine.

Talk title: Deep learning for understanding human vision

Abstract: How do we recognize what we see? Despite the deceptive ease of perceiving things, explaining how we see turns out to be a supremely difficult task. Only recently have advances in computer vision brought a class of models, known as deep neural nets, that are capable of matching human and non-human primate performance in several visual perception tasks. Our present aim is to develop these artificial systems further so that they would simultaneously (i) predict primate neural and behavioral responses during visual object recognition tasks, (ii) map well onto brain anatomy, and (iii) generalize to novel stimuli similarly to primates. I will first introduce Brain-Score, our composite benchmark for an extensive comparison of deep nets to the primate ventral visual stream. Building on the insights gained from such benchmarking, I will describe the CORnet family of models, which commits to the biological realities of the visual cortex. I will further extend our benchmarking to a much wider set of images, including cartoons and paintings, to test and compare the limits of generalization in humans and machines. Taken together, our approach brings forward a good baseline deep neural network that could serve as a building block towards developing capable artificial cognitive agents.
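As a rough illustration of how a benchmark of this kind compares a deep net to neural data, one can fit a linear map from model features to recorded neural responses and score held-out predictions by Pearson correlation. The sketch below is a simplified stand-in, not the official Brain-Score implementation, and all names are hypothetical:

```python
import numpy as np

def neural_predictivity(model_feats, neural_resp, split=0.5):
    """Simplified neural-predictivity score: fit a least-squares linear
    map from model features (images x features) to neural responses
    (images x recording sites) on the first part of the data, then
    average the per-site Pearson correlation on the held-out part."""
    n = model_feats.shape[0]
    k = int(n * split)
    X_tr, X_te = model_feats[:k], model_feats[k:]
    Y_tr, Y_te = neural_resp[:k], neural_resp[k:]
    # linear regression from model features to neural responses
    W, *_ = np.linalg.lstsq(X_tr, Y_tr, rcond=None)
    pred = X_te @ W
    # Pearson correlation per recording site, then averaged
    corrs = [np.corrcoef(pred[:, i], Y_te[:, i])[0, 1]
             for i in range(Y_te.shape[1])]
    return float(np.mean(corrs))
```

The actual benchmark adds cross-validation, regularized regression and a noise ceiling, but the core comparison is of this form: the better a model's features linearly explain recorded responses, the higher its score.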



Prof. Mikhail Lavrentiev

Professor at the Faculty of Information Technologies of Novosibirsk State University (Russia) and deputy director of the Institute of Automation and Electrometry (Siberian Branch of the Russian Academy of Sciences). Born on September 10, 1956. Member of the academic councils of NSU, the IT school, the College of Informatics, the Specialized High School, and the Institute of Automation and Electrometry SB RAS; member of the editorial board of the NSU Bulletin: Informatics; member of PhD defense councils; Director of the Pottosin Open All-Siberian Student Programming Contest. Visiting professor at 11 universities worldwide: Freie Universität Berlin (Germany), University of Padova (Italy), Milan University (Italy), University "Roma Tre" (Italy), Institute of Global Analysis (Florence, Italy), University of Maryland at College Park (USA), Pacific Marine Environmental Laboratory (US National Oceanic and Atmospheric Administration, Seattle, USA), National University of Singapore, University of Wollongong (Australia), University of Tokyo (Japan), and Keio University (Japan). Research focus: qualitative theory of differential equations; direct and inverse problems; nonlinear equations, degenerate equations, equations with integral terms; applications in the geosciences; mathematical simulation in chemistry; computer graphics for PC. Since 2013, Principal Investigator of research contracts with the Novosibirsk Technological Center of Schlumberger on code acceleration; a contract with the international holding Parallels (under Ruling 218 of the Russian Federation Government) on software tools for automation of cloud hosting; and a contract with the Pacific Marine Environmental Laboratory, NOAA, Seattle, on optimal positioning of DART stations.

Talk title: HPC for Tsunami Danger Prediction

Abstract: It takes only 20 minutes for a tsunami wave to reach dry land after an earthquake offshore Japan, so HPC resources are needed for a tsunami warning service. Several algorithms, covering major aspects of tsunami risk mitigation, were optimized to achieve near real-time performance on a modern PC using Graphics Processing Units (GPUs) and/or Field-Programmable Gate Arrays (FPGAs). Among the algorithms we describe: (a) optimization of the sensor network (deep-water pressure recorders and/or GPS stations); (b) filtering the tide wave from the measured data obtained at deep-water buoys; (c) estimation of the initial sea surface displacement at the tsunami source; (d) tsunami wave propagation through the ocean. The performance of the NOAA tsunami simulation tool, the so-called MOST software package, became nearly 100 times faster after optimization for the NVIDIA GPU architecture. As a result, we expect calculated wave amplitudes (covering an area of about 1,000 × 2,000 km) within 12 minutes after the earthquake.




Assoc. Prof. Paulo Novais

Associate Professor with Habilitation of Computer Science at the Department of Informatics, School of Engineering, University of Minho (Portugal) and a researcher at the ALGORITMI Centre, in which he is the leader of the research group ISlab (Synthetic Intelligence) and the coordinator of the research line Computer Science and Technology (CST). He is the director of the PhD Program in Informatics and co-founder and Deputy Director of the Master in Law and Informatics at the University of Minho. He started his career developing scientific research in the field of Intelligent Systems/Artificial Intelligence (AI), namely in Knowledge Representation and Reasoning, Machine Learning and Multi-Agent Systems. In recent years his interest has been absorbed by the different, yet closely related, concepts of Ambient Intelligence, Ambient Assisted Living, Intelligent Environments, Behavioural Analysis, Conflict Resolution and the incorporation of AI methods and techniques in these fields. His main research aim is to make systems a little smarter, more intelligent and also more reliable. He has led and participated in several research projects sponsored by Portuguese and European public and private institutions and has supervised several PhD and MSc students. He is the co-author of over 350 book chapters, journal papers, conference and workshop papers, and books. He has been the president of APPIA (the Portuguese Association for Artificial Intelligence) since 2016, is the Portuguese representative at IFIP TC 12 (Artificial Intelligence), chair of its Working Group on Intelligent Agents (WG12.3), and a member of the executive committee of IBERAMIA (the Ibero-American Society of Artificial Intelligence). In recent years, he has served as an expert/reviewer for several institutions, such as the EU Commission and FCT (the Portuguese agency that supports science, technology and innovation).

Talk title: Human-Computer Interaction: A Behavioural Approach

Abstract: Academia and enterprises have been very prolific in researching human-computer interaction and human behaviour through controlled studies, surveys, laboratory prototypes and applied case studies. The results of these research efforts show that there are several key factors involved in human-computer interaction activities, including physiological and psychological indicators, as well as behavioural and even somatic ones.
Stress is a critical component of human activity; it shows through physical, mental or emotional tension and has a high impact in medical and biological contexts. Thus, stress has great repercussions on psychological conditions such as depression and anxiety. In a corporate environment, workers are under increasing demand for performance, which leads to their being under constant pressure; their level of stress is therefore greatly affected and presents significant variations. To tackle these issues, this research aims to produce a new approach to human-computer interaction by developing a distributed multi-modal framework to monitor and assess the psychological stress of people doing high-end computer-related tasks. This is done in a non-intrusive and non-invasive way, using soft sensors to monitor activities (e.g., task performance and human behaviour).



Assoc. Prof. Kristof De Witte

Kristof De Witte is a tenured associate professor at the Faculty of Economics and Business at KU Leuven, Belgium, and he holds the chair in ‘Effectiveness and Efficiency of Educational Innovations’ at the Top Institute for Evidence Based Education Research (TIER) at Maastricht University, the Netherlands. At KU Leuven he is director of the research group ‘Leuven Economics of Education Research’. His research interests comprise the economics of education, performance evaluation, and political economy. He has published his work in many leading academic journals, including ‘The Economic Journal’, ‘Journal of Urban Economics’, ‘European Journal of Operational Research’, ‘Economics of Education Review’, ‘European Journal of Political Economy’ and ‘Scientometrics’. Further information: www.feb.kuleuven.be/kristof.dewitte

Talk title: The Laboratory of an Education Economist. Testing Cures for Disadvantaged Students

Abstract: Kristof De Witte overviews his research agenda on socio-economic segregation. In particular, he discusses various quasi-experimental evaluations of interventions designed to change the odds for disadvantaged students. On the one hand, he presents the effects of additional resources at the school level on cognitive and non-cognitive outcomes, and illustrates that extra resources at the municipality level only result in grade inflation. On the other hand, he argues that information shocks, provided by making school-quality information public, change the socio-economic composition of schools. He shows that in the longer run these findings are alarming, as the inability to break the vicious circle for low-SES students leads to more school dropout, and those students in particular have a lower return to education.



Students and researchers are invited to participate in the conference.

Each session talk is 15-20 minutes long. Poster sessions will provide an opportunity for authors to display the results and conclusions of their research. Posters will be exhibited during the workshop.
The recommended poster size is A1 (84.1 cm × 59.4 cm); the language is English.