IMT Institutional Repository (EPrints)
Feed retrieved 2024-03-28; no filter conditions; results ordered by date deposited.
http://eprints.imtlucca.it/

http://eprints.imtlucca.it/id/eprint/4012 (deposited 2018-03-12)
Model Predictive Control of Nonholonomic Mobile Robots Without Stabilizing Constraints and Costs
The problem of steering a nonholonomic mobile robot to a desired position and orientation is considered. In this paper, a model predictive control (MPC) scheme based on a tailored nonquadratic stage cost is proposed to fulfil this control task. We rigorously prove asymptotic stability while neither stabilizing constraints nor costs are used. To this end, we first design suitable maneuvers to construct bounds on the value function. Second, these bounds are exploited to determine a prediction horizon length such that the asymptotic stability of the MPC closed loop is guaranteed. Finally, numerical simulations are conducted to illustrate the necessity of having nonquadratic running costs.
Authors: Karl Worthmann; Mohamed W. Mehrez; Mario Zanon (mario.zanon@imtlucca.it); George K. I. Mann; Raymond G. Gosine; Moritz Diehl

http://eprints.imtlucca.it/id/eprint/4010 (deposited 2018-03-09)
Primal decomposition of the optimal coordination of vehicles at traffic intersections
Authors: Robert Hult; Mario Zanon; Sebastien Gros; Paolo Falcone

http://eprints.imtlucca.it/id/eprint/4011 (deposited 2018-03-09)
A tracking MPC formulation that is locally equivalent to economic MPC
The stability proof for economic model predictive control (MPC) is in general difficult to establish. In contrast, tracking MPC has well-established and practically applicable stability guarantees, but can yield poor closed-loop performance in terms of the selected economic criterion. In this paper, we derive a formal procedure to design a tracking MPC scheme so as to locally approximate the behaviour of economic MPC. Given an economic stage cost, the desired tracking stage cost can therefore be computed automatically. Because tracking MPC guarantees stability of the closed-loop system, our procedure succeeds if and only if economic MPC is locally stabilising. This fact can be used to certify whether economic MPC is not stabilising. We illustrate the theoretical developments in a simulated example.
Authors: Mario Zanon (mario.zanon@imtlucca.it); Sébastien Gros; Moritz Diehl

http://eprints.imtlucca.it/id/eprint/4007 (deposited 2018-03-09)
Time-optimal race car driving using an online exact Hessian based nonlinear MPC algorithm
This work presents an embedded nonlinear model predictive control (NMPC) strategy for autonomous vehicles under a minimum-time objective. The time-optimal control problem is stated in a path-parametric formulation such that existing reliable numerical methods for real-time nonlinear MPC can be used. Building on previous work on time-optimal driving, we present an approach based on a sequential quadratic programming type algorithm with online propagation of second-order derivatives.
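Receding-horizon control, the common thread of these MPC entries, can be illustrated with a minimal linear-quadratic sketch: at every step a finite-horizon cost is optimized, only the first predicted input is applied, and the horizon shifts forward. This is a hypothetical double-integrator toy with made-up weights, not the formulations studied in the papers above.

```python
import numpy as np

def first_step_gain(A, B, Q, R, N):
    """Backward Riccati recursion over an N-step horizon with a quadratic
    tracking cost; returns the feedback gain of the first predicted input."""
    P = Q.copy()
    K = np.zeros((B.shape[1], A.shape[0]))
    for _ in range(N):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Hypothetical plant: a discrete-time double integrator.
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q, R = np.eye(2), np.array([[1.0]])

x = np.array([[1.0], [0.0]])      # initial position error
for _ in range(300):              # closed loop: apply first input, shift horizon
    K = first_step_gain(A, B, Q, R, N=30)
    x = A @ x - B @ (K @ x)
print(np.linalg.norm(x))          # state is driven near the origin
```

For a time-invariant plant the gain could of course be computed once; recomputing it each step mirrors the receding-horizon structure that nonlinear MPC requires.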
As an illustration of our method, we provide closed-loop simulation results based on a vehicle model identified for small-scale electric race cars.
Authors: Robin Verschueren; Mario Zanon (mario.zanon@imtlucca.it); Rien Quirynen; Moritz Diehl

http://eprints.imtlucca.it/id/eprint/4034 (deposited 2018-03-09)
Modularities maximization in multiplex network analysis using Many-Objective Optimization
Nowadays, social network analysis receives considerable attention from academia, industry and government. Practical applications such as community detection and centrality in economic networks have become main issues in this research area. Community detection in complex network analysis is commonly accomplished by the Louvain method, which heuristically seeks a partitioning with maximal modularity. Traditionally, community detection has been applied to networks with homogeneous semantics, for instance edges indicating friendship between people or import-export relationships between countries. However, we increasingly deal with more complex, so-called multiplex networks, in which the set of nodes stays the same while there are multiple sets of edges. In the analysis we would like to identify communities, but different edge sets give rise to different modularity-optimizing partitions into communities. We propose to view community detection in such multilayer networks as a many-objective optimization problem. To this end, we apply evolutionary many-objective optimization and compute the Pareto fronts between different modularity layers. We then group the objective functions into communities in order to better understand the relationship and dependence between different layers (conflict, indifference, complementarity).
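The core tension the paper exploits, one partition scoring differently on different layers, is easy to reproduce. The following toy (invented graphs and a hand-picked partition, unrelated to the paper's data sets) scores the same partition on two layers using the standard Newman modularity Q = sum over communities c of e_c/m - (d_c/2m)^2:

```python
def modularity(edges, comm):
    """Newman modularity of a node partition for one undirected layer.
    edges: list of (u, v) pairs; comm: dict mapping node -> community label."""
    m = len(edges)
    deg = {n: 0 for n in comm}
    inside = {}                        # intra-community edge counts
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
        if comm[u] == comm[v]:
            inside[comm[u]] = inside.get(comm[u], 0) + 1
    return sum(inside.get(c, 0) / m
               - (sum(d for n, d in deg.items() if comm[n] == c) / (2 * m)) ** 2
               for c in set(comm.values()))

# Two layers over the same six nodes: a toy multiplex network.
part = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
layer1 = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]  # two triangles
layer2 = [(0, 3), (1, 4), (2, 5), (0, 4)]                          # cross edges only

q1, q2 = modularity(layer1, part), modularity(layer2, part)
print(q1, q2)  # the same partition is good for layer 1, bad for layer 2
```

Maximizing q1 and q2 simultaneously has no single optimum here, which is exactly why the paper resorts to Pareto fronts between layer modularities.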
As a case study, we compute the Pareto fronts for model problems and for economic data sets in order to show how to find the network modularity trade-offs between different layers.
Authors: Asep Maulana; Valerio Gemmetto; Diego Garlaschelli (diego.garlaschelli@imtlucca.it); Iryna Yevesyeva; Michael Emmerich

http://eprints.imtlucca.it/id/eprint/3953 (deposited 2018-03-05)
Programming of CAS Systems by Relying on Attribute-Based Communication
In most distributed systems, named connections (i.e., channels) are used as the means for programming interaction between communicating partners. These kinds of connections are low level and usually totally independent of the knowledge, the status, the capabilities (in one word, of the attributes) of the interacting partners. We have recently introduced a calculus, called AbC, in which interactions among agents are dynamically established by taking into account "connections" as determined by predicates over agent attributes. In this paper, we present a Java run-time environment that has been developed to support modeling and programming of collective adaptive systems by relying on the communication primitives of the AbC calculus. Systems are described as sets of parallel components; each component is equipped with a set of attributes, and communications among components take place in an implicit multicast fashion.
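The attribute-based, implicitly multicast style of interaction can be mimicked in a few lines. This is a plain Python sketch of the idea only, not the AbC calculus or the Java run-time presented in the paper, and the scenario (roles, battery levels) is invented:

```python
class Component:
    """A collective-adaptive-systems component: a mutable attribute
    environment plus an inbox for received messages."""
    def __init__(self, **attributes):
        self.attributes = dict(attributes)
        self.inbox = []

def attribute_send(components, sender, message, predicate):
    """Implicit multicast: deliver the message to every other component
    whose attributes satisfy the sender's predicate; no channels, no names."""
    for c in components:
        if c is not sender and predicate(c.attributes):
            c.inbox.append(message)

# Hypothetical scenario: a coordinator addresses low-battery explorers.
robots = [Component(role="explorer", battery=80),
          Component(role="explorer", battery=20),
          Component(role="relay", battery=15)]
hub = Component(role="hub")

attribute_send(robots + [hub], hub, "return-to-base",
               lambda a: a.get("role") == "explorer" and a["battery"] < 30)

print([c.inbox for c in robots])  # only the low-battery explorer receives it
```

Because delivery is decided by predicates over current attribute values, updating an attribute at run time (e.g. a battery level) changes who receives future messages, which is the opportunistic behavior the abstract refers to.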
By means of a number of examples, we also show how opportunistic behaviors, achieved by run-time attribute updates, can be exploited to express different communication and interaction patterns and to program challenging case studies.
Authors: Yehia Moustafa Abd Alrahman (yehia.abdalrahman@imtlucca.it); Rocco De Nicola (r.denicola@imtlucca.it); Michele Loreti

http://eprints.imtlucca.it/id/eprint/3934 (deposited 2018-03-02)
L'analisi per indici del bilancio di esercizio
Reclassification and ratio analysis of the financial statements are treated as tools available to management for governing the firm. The chapter includes a business case study.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Gino Fontana

http://eprints.imtlucca.it/id/eprint/3933 (deposited 2018-03-02)
Management Science for Complex Networks and Smart Water Grids: a case study in Italy
As the effects of climate change unfold and become more visible, infrastructures, especially those related to the distribution of water, are the most exposed to the deep changes expected in the coming years. Water is fundamental for people and for infrastructures such as energy, waste, and food production. Water sustainability is therefore a fundamental issue, to be addressed through an efficient use of resources and the maintenance of quality standards, adopting a management science perspective. The water industry and infrastructure thus need a deep transformation, and we claim that this transformation is the result of a synergy between different fields of research. Our paper presents a managerial framework based on a complex-systems approach, used to reshape and optimize, in different respects, the performance of the water infrastructure through the development of a case study in Italy. Our framework, called Acque 2.0 (Water 2.0), is based on these pillars: (1) the current and future scenarios for water management; (2) management science and water; (3) digitalization of water infrastructure; (4) increasing network resiliency and quality of service using complex networks; (5) use of predictive maintenance methods based on network simulations and big data; (6) involvement of utilities, regulators, policy makers, and citizens; (7) remarks and conclusions. The case study will be developed in the municipality of Viareggio, characterized by old infrastructures, seasonal variation of population, and water scarcity.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Angelo Facchini (angelo.facchini@imtlucca.it); Guido Caldarelli (guido.caldarelli@imtlucca.it); Antonio Scala; Giovanni Liberatore

http://eprints.imtlucca.it/id/eprint/3932 (deposited 2018-03-02)
Conclusioni
This concluding commentary closes the volume and focuses on the strategic impact and the operational implications of the ongoing changes in the strategic governance of the family firm.
Authors: Luca Anselmi; Nicola Lattanzi (nicola.lattanzi@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3931 (deposited 2018-03-02)
La PMI a conduzione familiare di fronte agli accordi di Basilea
The family-run SME and the bank rating system for the granting of credit: the impact on business competencies and the implications for communication processes.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Lorenzo Dal Maso

http://eprints.imtlucca.it/id/eprint/3930 (deposited 2018-03-02)
Giustizia e democrazia: Kokoschka rilegge Le Rane di Aristofane
Oskar Kokoschka (1886-1980) engraved the two drypoints Das Streitgespräch and Das Tribunal between 1967 and 1968, within a series of 12, to illustrate Aristophanes' The Frogs (Βάτραχοι, 405 BC). These two artworks comment on the concept of justice, stress the artist's political engagement and directly refer to the political turmoil which struck Greece in the second half of the 1960s.
Authors: Silvia Massa (silvia.massa@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3925 (deposited 2018-02-28)
Un indicatore di sintesi dell'economicità aziendale: il valore economico del capitale
This part of the volume examines the concept of economic capital and the main approaches used to determine this value.
Authors: Riccardo Giannetti; Nicola Lattanzi (nicola.lattanzi@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3924 (deposited 2018-02-28)
Ricchezza aziendale e patrimonio intangibile: prospettive di osservazione, strumenti di misura, modelli di rappresentazione e analisi
The chapter deals with the process by which income is generated and with the reading of the qualitative-productive dimension of business performance.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3923 (deposited 2018-02-28)
Identità territoriale e radici familiari strategie di crescita
The chapter deals with the strategic relevance of the link between an area with a strong productive vocation and the depth of the family firm's roots.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Giulia Santucci; Maria Grazia Migliaccio

http://eprints.imtlucca.it/id/eprint/3922 (deposited 2018-02-28)
Il Made in Tuscany come competenza identitaria e distintiva dell'azienda familiare
The chapter deals with the strategic potential inherent in "Made in Tuscany" and investigates some possible ways of exploiting it.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Alberto Grassi; Valentina Pieroni

http://eprints.imtlucca.it/id/eprint/3920 (deposited 2018-02-28)
Neuroscience Evidence for Economic Humanism in Management Science: Organizational Implications and Strategy
Globalization phenomena and Information & Communication Technology (ICT) are producing deep changes worldwide. The economic environment and society in which firms both cooperate and compete are rapidly changing, leading firms to recognize the role of intangible resources as a source of fresh competitive advantage. Experience, innovation and the ability to create new knowledge arise entirely from the activity of human resources, inviting firms to focus on how to generate and shape knowledge. Therefore, the future of firms depends greatly on how managers are able to explore and exploit human resources. However, without a clear understanding of the nature of human beings and the complexity behind human interactions, we cannot understand the theory of organizational knowledge creation. Thus, how can firms discover, manage and valorize this "human advantage"? Neuroscience can increase the understanding of how cognitive and emotional processes work; in doing so, we may be able to better understand how individuals involved in a business organization make decisions and how external factors influence their behavior, especially in terms of commitment activation and engagement level. In this respect, a neuroscientific approach to business can support managers in decision-making processes. In a scenario where economic humanism plays a central role in the process of fostering firms' competitiveness and emerging strategies, we believe that a neuroscience approach in a business organization can be a valid source of value and inspiration for managers' decision-making processes.
Authors: Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Lorenzo Dal Maso; Dario Menicagli

http://eprints.imtlucca.it/id/eprint/3919 (deposited 2018-02-28)
La rilevanza e i caratteri degli aspetti psicologici nella conduzione dell'azienda familiare
In the running of family firms, elements typical of work and organizational psychology are interwoven with others of a familial, systemic and interpersonal nature that are normally absent, or in any case of little relevance, in other business settings. The chapter offers some interpretive keys by reviewing the proposals of the most significant schools of psychological thought, combined with a business-economics perspective of analysis.
Authors: Ioana Cristea; Claudio Gentili; Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Pietro Pietrini (pietro.pietrini@imtlucca.it); Giuseppina Rota

http://eprints.imtlucca.it/id/eprint/3918 (deposited 2018-02-28)
(a cura di) Elementi di bilancio e di management. Vol. 1: Il bilancio di esercizio: principi, schemi e criteri di valutazione
The volume is entirely devoted to the annual financial statements.
Editors: Marco Allegrini; Riccardo Giannetti; Nicola Lattanzi (nicola.lattanzi@imtlucca.it); Simone Lazzini

http://eprints.imtlucca.it/id/eprint/3917 (deposited 2018-02-28)
La struttura finanziaria delle società di capitali a conduzione familiare
The contribution deals with the specific features of family management in limited companies and its implications for the firm's possible financial structure.
Authors: Lucia Calvosi; Nicola Lattanzi (nicola.lattanzi@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3911 (deposited 2018-02-16)
La lentissima formazione dei musei statali in Italia
Authors: Lorenzo Casini (lorenzo.casini@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3910 (deposited 2018-02-16)
Globalizzazione e amministrazioni pubbliche nazionali. La dimensione organizzativa
Authors: Lorenzo Casini (lorenzo.casini@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3901 (deposited 2018-02-15)
La riforma del Mibact tra mito e realtà
The article deals with the 2014 reform of the Italian Ministry for Culture and Tourism, which has triggered controversial responses from scholars, the media and public opinion. Starting from the actual legal and institutional context, the author aims to demonstrate the weaknesses of the main critiques formulated against the reform. This makes it possible to address the main problems that still affect the Ministry and to illustrate the next steps that must be taken in order to support and improve the reform process.
Authors: Lorenzo Casini (lorenzo.casini@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3900 (deposited 2018-02-15)
«The Loneliness of the Comparative Lawyer». In memoria di John Henry Merryman (1920-2015)
Authors: Lorenzo Casini (lorenzo.casini@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3899 (deposited 2018-02-15)
«I Cannot Command Winds and Weather»: uno sguardo inglese sul diritto amministrativo globale
Giulio Napolitano, Barbara Marchetti and Lorenzo Casini comment on Paul Craig's book «UK, EU and Global Administrative Law. Foundations and Challenges». The authors examine the most relevant of the issues raised by this important volume, with particular reference to the origins of English administrative law, the evolution of European administrative law, and the emergence of global administrative law.
Authors: Lorenzo Casini (lorenzo.casini@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3873 (deposited 2018-01-24)
Dynamic Adverse Selection and the Supply Size
In this paper we examine the problem of dynamic adverse selection in a stylized market where the quality of goods is a seller's private information, while the realized distribution of qualities is public information. We find that full trade occurs in every dynamic competitive equilibrium. Moreover, we show that if prices can be conditioned on the supply size then a dynamic competitive equilibrium always exists, while it fails to exist if prices cannot be conditioned on the supply size and the frequency of exchanges is high enough. We conclude that the possibility of conditioning prices on the supply size makes it possible to reach efficiency in the limit as exchanges become more and more frequent, while otherwise the welfare loss due to delays of exchanges remains bounded away from zero.
Authors: Ennio Bilancini (ennio.bilancini@imtlucca.it); Leonardo Boncinelli

http://eprints.imtlucca.it/id/eprint/3872 (deposited 2018-01-24)
Strict Nash Equilibria in Non-Atomic Games with Strict Single Crossing in Players (or Types) and Actions
In this paper we study games where the space of player types is atomless and payoff functions satisfy the property of strict single crossing in types and actions.
Under an additional assumption of quasisupermodularity in actions of the payoff functions, and mild assumptions on the type space (partially ordered, with sets of incomparable types having negligible size) and on the action space (a lattice, second countable and satisfying a separation property with respect to the ordering of actions), we prove that every Nash equilibrium is essentially strict. Further, by building on McAdams (2003, Theorem 1), we prove the existence of a strict Nash equilibrium and an evolutionarily stable strategy in a general class of incomplete-information games satisfying strict single crossing in types and actions.
Authors: Ennio Bilancini (ennio.bilancini@imtlucca.it); Leonardo Boncinelli

http://eprints.imtlucca.it/id/eprint/3866 (deposited 2018-01-24)
Social Capital Predicts Happiness Over Time
The evidence summarized in this chapter documents that GDP is associated with well-being over the business cycle, while this correlation tends to wane over longer periods. Instead, the relationship between social capital and well-being tends to set in slowly and to be persistent. This is consistent both with the notion that income is affected by adaptation and social comparisons and, conversely, with the notion that social capital is not affected by these same forces. Moreover, we sum up studies concerning the world's most striking cases of the Easterlin paradox: the American, Chinese, and Indian ones. These three countries, amounting to almost half of the world population, exhibit a decline in subjective well-being (SWB) despite more or less outstanding economic growth in the past few decades. We document that the decline of social capital plays a major role in predicting the decline of happiness in these countries, besides the well-known role of social comparisons. This suggests that the decline of social capital may be part of the explanation of the Easterlin paradox. Overall, our findings support the view that the basic message of happiness economics should not change, i.e., the centrality of GDP should be reduced. Indeed, social capital, as well as economic growth, can be a target for policies aimed at preserving or fostering it (Helliwell, 2011a; Helliwell, 2011b; Rogers et al., 2011; Bilancini and D'Alessandro, 2012).
Authors: Stefano Bartolini; Ennio Bilancini (ennio.bilancini@imtlucca.it); Francesco Sarracino

http://eprints.imtlucca.it/id/eprint/3865 (deposited 2018-01-24)
(a cura di) Policies for Happiness
In recent years, debates on the economics of happiness have shown that, over the long term, well-being is influenced more by social and personal relationships than by income. This evidence challenges the traditional economic policy paradigm that has emphasized income as the primary determinant of well-being. This volume brings together contributions from leading scholars to ask: What should be done to improve the quality of people's lives? Can economic and social changes be made which enhance well-being? What policies are required? How do policies for well-being differ from traditional ones targeted on redistribution, the correction of market inefficiencies, and growth? Are there dimensions of well-being that have been neglected by traditional policies? Is happiness a meaningful policy target? The volume presents reflections and proposals which constitute a first step towards answering these questions.
Editors: Stefano Bartolini; Ennio Bilancini (ennio.bilancini@imtlucca.it); Luigino Bruni; Pierluigi Porta

http://eprints.imtlucca.it/id/eprint/3832 (deposited 2017-11-24)
La gestualità del dolore rituale tra parole e immagini
This paper deals with mourning gestures in Roman literary and iconographic production. Both literature and iconography are thought to be the product of a shared cultural background but, at the same time, they differ in the way each of them expresses this background: the literary form is dynamic, while the iconographic form seems to be static. Hence, ritual mourning scenes are analyzed both in literature (where some vivid details allow the reader to visualize what is being said through words) and in iconography (where an aoristic image stands for a durative narration). In particular, the analysis focuses on a selection of literary sources that contain vivid descriptions of ritual mourning scenes and date back to the period between the first century BC and the second century AD. They are compared to the conclamatio scenes portrayed on some groups of Urban sarcophagi, i.e. children's sarcophagi and some mythological ones (depicting the death of Patroclus, Meleager, or Alcestis), all of which date back to the middle Imperial Age. Thus, this paper aims at decoding mourning gestures as well as drawing attention to the similarities and differences that can be detected between the literary horizon and the iconographic one with regard to this particular theme.
Authors: Elisa Bernard (elisa.bernard@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3831 (deposited 2017-11-23)
La Casa di Giulietta di Antonio Avena. Quando l'architettura diventa coup de théâtre
(Antonio Avena's House of Juliet. When architecture becomes a coup de théâtre.) This paper discusses the "restoration" of the so-called House of Juliet in Verona, which was influenced by Antonio Avena and was finally carried out between November 1939 and April 1941. The data from topographical, archaeological and archival sources, together with historical guidebooks and postcards, reveal that, up until the 20th century, the location of Juliet's House was understood to be the casa torre facing via Cappello. After a long series of proposed restoration projects dating back to the early 20th century, the restoration conducted by A. Avena created Juliet's House as we know it today in the building facing the inner courtyard. Through the study of the archival sources, old photographs and the archaeological analysis of the facies of the building, it is possible to form an idea of how the restoration was carried out and of the ideas underlying the project. The restoration consisted in a combination of recycled fragments, "antiqua spolia", and architectural elements emulating ancient ones, in order to create an evocative pastiche. It was meant to recreate a medieval atmosphere and to provide a backdrop to the Shakespearean drama.
Authors: Elisa Bernard (elisa.bernard@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3816 (deposited 2017-09-29)
(a cura di) Atti del Convegno GIMC-GMA 2016
Editors: Marco Paggi (marco.paggi@imtlucca.it); Andrea Bacigalupo (andrea.bacigalupo@imtlucca.it); Stefano Bennati; Claudia Borri (claudia.borri@imtlucca.it); Mauro Corrado (mauro.corrado@polito.it); Andrea Gizzi; Paolo Sebastiano Valvo

http://eprints.imtlucca.it/id/eprint/3817 (deposited 2017-09-29)
Percolation properties of the free volume generated by two rough surfaces in contact
The mechanism of fluid leakage through the free volume between rough surfaces in
contact is relevant in physics and in many engineering applications. In the present study, the normal contact problem between randomly generated fractal rough surfaces is solved using the boundary element method. Then, an algorithm for the evaluation of the network involved in the percolation of fluid is proposed. Numerical results are synthetically collected in diagrams relating the free volume involved in the percolation to the dimensionless statistical parameters of the rough surface.
Authors: Paolo Cinat (paolo.cinat@imtlucca.it); Marco Paggi (marco.paggi@imtlucca.it); Claudia Borri (claudia.borri@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3812 (deposited 2017-09-28)
Topology simulation and contact mechanics of bifractal rough surfaces
A numerical method is proposed to generate bifractal surfaces through a modification of the slope of the power spectral density function in the low- or high-frequency range. The method has been applied to simulate real surfaces of a Ginkgo biloba leaf scanned at two different magnifications, by matching the corresponding experimental power spectral densities. Slight differences were found in the statistical distributions of the asperity heights and curvatures at the lowest magnification, which had marginal influence on the frictionless normal contact response of the surface. At the highest magnification, however, the statistics of the simulated numerical surface were quite different from those of the real one, leading also to a significant difference in the normal contact results.
Authors: Claudia Borri (claudia.borri@imtlucca.it); Marco Paggi (marco.paggi@imtlucca.it)

http://eprints.imtlucca.it/id/eprint/3789 (deposited 2017-09-18)
Passive control of wave propagation in periodic anti-tetrachiral meta-materials
Periodic anti-tetrachiral materials are strongly characterized by a marked auxeticity,
the unusual and fascinating mechanical property mathematically expressed by negative values
of the Poisson’s ratio. The auxetic behavior is primarily provided by pervasive rolling-up mechanisms
developed by the doubly-symmetric micro-structure of the periodic cell, composed by a
regular pattern of rigid rings connected by tangent flexible ligaments. Adopting a beam-lattice
model to describe the linear free dynamics of the elementary cell, the planar wave propagation
along the bi-dimensional material domain can be studied according to the Floquet-Bloch
theory. Parametric analyses of the dispersion curves, carried out with numerical or asymptotic
tools, typically reveal a highly-dense spectrum, with persistent absence of total band-gaps in the
low-frequency range. The paper analyses the wave propagation in the meta-material developed
by introducing rigid massive inserts, locally housed by all the rings and working as undamped
linear oscillators with assigned inertia and/or stiffness properties. The elastic coupling between
the cell microstructure and the oscillators, if properly tuned (inertial resonators), is found to
significantly modify the Floquet-Bloch spectrum of the material. The effects of the resonator
parameters (tuning frequency and mass ratio) on the low-frequency band structure of the metamaterial
are discussed, with focus on the valuable possibility to (i) open total band gaps, by
either the widening of an existing partial band gap or the avoidance of a crossing point between
adjacent dispersion curves, and (ii) finely control the total band-gap amplification, in order to assess
the maximum achievable performance of the meta-material against vibration propagation.Marco LepidiAndrea Bacigalupoandrea.bacigalupo@imtlucca.it2017-09-18T08:51:48Z2017-09-18T08:51:48Zhttp://eprints.imtlucca.it/id/eprint/3784This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37842017-09-18T08:51:48ZSimplified modelling of chiral lattice materials with local resonatorsA simplified model of periodic chiral beam-lattices containing local resonators has been formulated to obtain a better understanding of the influence of the chirality and of the dynamic characteristics of the local resonators on the acoustic behaviour. The beam-lattice models are made up of a periodic array of rigid heavy rings, each one connected to the others through elastic slender massless ligaments and containing an internal resonator made of a rigid disk in a soft elastic annulus. The band structure and the occurrence of low-frequency band-gaps are analysed through a discrete Lagrangian model. For both the hexa- and the tetrachiral lattice, two acoustic modes and four optical modes are identified and the influence of the dynamic characteristics of the resonator on those branches is analysed together with some properties of the band structure. By approximating the ring displacements of the discrete Lagrangian model as a continuum field and through an application of the generalized macro-homogeneity condition, a generalized micropolar equivalent continuum has been derived, together with the overall equation of motion and the constitutive equation given in closed form. 
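The band-gap mechanism discussed in the surrounding abstracts (a tuned internal resonator reshaping the Floquet-Bloch spectrum of a periodic lattice) can be illustrated on the simplest possible case, a one-dimensional mass-in-mass chain; this is a minimal sketch, and all parameter values are illustrative assumptions, not taken from the papers.

```python
import numpy as np

def dispersion(q, m_ring=1.0, m_res=0.5, k_lig=1.0, k_res=0.3):
    """Bloch frequencies of a 1D mass-in-mass chain at wavenumber q.

    Each cell: an outer 'ring' mass coupled to its neighbours by
    springs k_lig, plus an internal resonator attached via k_res.
    All parameter values are illustrative.
    """
    K = np.array([[2.0 * k_lig * (1.0 - np.cos(q)) + k_res, -k_res],
                  [-k_res, k_res]])
    M = np.diag([m_ring, m_res])
    # Symmetrise the generalized eigenproblem: M^(-1/2) K M^(-1/2).
    Mi = np.diag(1.0 / np.sqrt(np.diag(M)))
    w2 = np.linalg.eigvalsh(Mi @ K @ Mi)
    return np.sqrt(np.clip(w2, 0.0, None))  # (acoustic, optical)

qs = np.linspace(1e-3, np.pi, 200)
branches = np.array([dispersion(q) for q in qs])
acoustic, optical = branches[:, 0], branches[:, 1]
gap = (acoustic.max(), optical.min())  # total band gap, if gap[0] < gap[1]
```

With these values the acoustic branch saturates just below the resonator frequency sqrt(k_res/m_res) ≈ 0.77 while the optical branch starts above it, so a total band gap opens around the tuning frequency; shifting k_res or m_res moves the gap, which is precisely the design lever the abstracts describe.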
The validity limits of the dispersion functions provided by the micropolar model are assessed by a comparison with those obtained by the discrete model.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itLuigi Gambarotta2017-09-04T14:50:49Z2017-09-04T14:50:49Zhttp://eprints.imtlucca.it/id/eprint/3779This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37792017-09-04T14:50:49ZRegionally specific features of low-frequency EEG oscillations during REM-sleepGiulio BernardiMonica Bettamonica.betta@imtlucca.itYu XiaoqianEmiliano Ricciardiemiliano.ricciardi@imtlucca.itJ. Haba-RubioR. HeinzerPietro Pietrinipietro.pietrini@imtlucca.itGiulio TononiFrancesca Siclari2017-09-04T14:47:55Z2017-09-04T14:47:55Zhttp://eprints.imtlucca.it/id/eprint/3778This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37782017-09-04T14:47:55ZSpontaneous, localized EEG activations in REM sleep: a high-density EEG investigationMonica Bettamonica.betta@imtlucca.itGiulio BernardiDanilo MenicucciJ. Haba-RubioR. HeinzerAngelo GemignaniAlberto LandiGiulio TononiFrancesca Siclari2017-09-04T13:40:21Z2017-09-04T13:40:21Zhttp://eprints.imtlucca.it/id/eprint/3770This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37702017-09-04T13:40:21ZPerspectives on optimal control of varicella and herpes zoster by mass routine varicella vaccinationHerpes zoster arises from reactivation of the varicella–zoster virus (VZV), which causes varicella in children. As reactivation occurs when cell-mediated immunity (CMI) declines, and there is evidence that re-exposure to VZV boosts CMI, mass varicella immunization might increase the zoster burden, at least for some decades. Fear of this natural zoster boom is the main reason for the paralysis of varicella immunization in Europe. 
We apply optimal control to a realistically parametrized age-structured model for VZV transmission and reactivation to investigate whether there exist feasible varicella immunization paths that are optimal in controlling both varicella and zoster. We analyse the optimality system numerically, focusing on the role of the cost functional, of the relative zoster–varicella cost and of the planning horizon length. We show that optimal programmes will mostly be unfeasible for public health owing to their complex temporal profiles. This complexity is the consequence of the intrinsically antagonistic nature of varicella immunization programmes when aiming to control both varicella and zoster. However, we show that gradually increasing—hence feasible—vaccination schedules can perform better than routine programmes with constant vaccine uptake. Finally, we show the optimal profiles of feasible programmes targeting mitigation of the post-immunization natural zoster boom with priority.Monica Bettamonica.betta@imtlucca.itMarco LaurinoAndrea PuglieseGiorgio GuzzettaAlberto LandiPiero Manfredi2017-08-08T07:44:36Z2017-08-08T07:44:36Zhttp://eprints.imtlucca.it/id/eprint/3764This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37642017-08-08T07:44:36ZFluid Analysis of Spatio-Temporal Properties of Agents in a Population ModelWe consider large stochastic population models in which heterogeneous agents are interacting locally and moving in space. These models are very common, e.g. in the context of mobile wireless networks, crowd dynamics, traffic management, but they are typically very hard to analyze, even when space is discretized in a grid. Here we consider individual agents and look at their properties, e.g. quality of service metrics in mobile networks. 
Leveraging recent results on the combination of stochastic approximation with formal verification, and on fluid approximation of spatio-temporal population processes, we devise a novel mean-field based approach to check such behaviors, which requires the solution of a low-dimensional set of Partial Differential Equations and is shown to be much faster than simulation. We prove the correctness of the method and validate it on a mobile peer-to-peer network example.Luca BortolussiMax Tschaikowskimax.tschaikowski@imtlucca.it2017-08-07T10:39:49Z2018-03-08T16:56:22Zhttp://eprints.imtlucca.it/id/eprint/3763This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37632017-08-07T10:39:49ZA generation-attraction model for renewable energy flows in Italy: A complex network approachIn recent years, in Italy, the trend of the electricity demand and the need to connect a large number of renewable energy power generators to the power grid have driven the development of a novel type of energy transmission/distribution infrastructure. The Italian Transmission System Operator (TSO) and the Distribution System Operator (DSO) worked on a new infrastructural model, based on electronic meters and information technology. In pursuing this objective it is of crucial importance to understand how ever larger shares of renewable energy can be fully integrated, providing a constant and reliable energy background over space and time. This is particularly true for intermittent sources such as photovoltaic installations, owing to their fine-grained distribution across the country. In this work we use a deliberately simplified model to characterize the Italian power grid as a graph whose nodes are Italian municipalities and whose edges cross the administrative boundaries between a selected municipality and its first neighbours, following a Delaunay triangulation. 
Our aim is to describe the power flow as a diffusion process over a network, and using open data on the solar irradiation at the ground level, we estimate the production of photovoltaic energy at each node. An attraction index was also defined using demographic data, in accordance with average per capita energy consumption data. The available energy at each node was calculated by finding the stationary state of a generation-attraction model.Luca ValoriGiovanni Luca GiannuzziAngelo Facchiniangelo.facchini@imtlucca.itTiziano Squartinitiziano.squartini@imtlucca.itDiego Garlaschellidiego.garlaschelli@imtlucca.itRiccardo Basosi2017-08-04T11:26:13Z2017-08-04T11:26:13Zhttp://eprints.imtlucca.it/id/eprint/3756This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37562017-08-04T11:26:13ZInvestigating the interplay between fundamentals of national research systems: Performance, investments and international collaborationsWe discuss, at the macro-level of nations, the contribution of research funding and rate of international collaboration to research performance, with important implications for the “science of science policy”. In particular, we cross-correlate suitable measures of these quantities with a scientometric-based assessment of scientific success, studying both the average performance of nations and their temporal dynamics in the space defined by these variables during the last decade. We find significant differences among nations in terms of efficiency in turning (financial) input into bibliometrically measurable output, and we confirm that growth of international collaboration positively correlates with scientific success—with significant benefits brought by EU integration policies. Various geo-cultural clusters of nations naturally emerge from our analysis. 
We critically discuss the factors that potentially determine the observed patterns.Giulio Ciminigiulio.cimini@imtlucca.itAndrea ZaccariaAndrea Gabrielli2017-08-04T11:22:51Z2017-08-04T11:22:51Zhttp://eprints.imtlucca.it/id/eprint/3755This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37552017-08-04T11:22:51ZEntangling Credit and Funding Shocks in Interbank MarketsCredit and liquidity shocks represent the main channels of financial contagion for interbank lending markets. On the one hand, banks face potential losses whenever their counterparties are under distress and thus unable to fulfill their obligations. On the other hand, solvency constraints may force banks to recover lost funding by selling their illiquid assets, resulting in effective losses in the presence of fire sales—that is, when funding shortcomings are widespread over the market. Because of the complex structure of the network of interbank exposures, these losses reverberate among banks and eventually get amplified, with potentially catastrophic consequences for the whole financial system. Inspired by the recently proposed Debt Rank, in this work we define a systemic risk metric that estimates the potential amplification of losses in interbank markets accounting for both credit and liquidity contagion channels: the Debt-Solvency Rank. We implement this framework on a dataset of 183 European banks that were publicly traded between 2004 and 2013, showing indeed that liquidity spillovers substantially increase systemic risk, and thus cannot be neglected in stress-test scenarios. 
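As a rough illustration of the loss-amplification idea underlying the Debt-Solvency Rank, the following sketches a credit-channel-only, DebtRank-style propagation on a hypothetical three-bank network; the exposure matrix, shock and value weights are invented for illustration, and the paper's actual metric additionally models the liquidity channel.

```python
import numpy as np

def debtrank(lam, h0, v, max_rounds=100):
    """DebtRank-style distress propagation, credit channel only (sketch).

    lam[i, j]: loss of bank i per unit distress of bank j
               (interbank exposure over lender equity).
    h0: initial distress levels in [0, 1]; v: relative economic values.
    Each bank propagates distress once, right after becoming distressed.
    Returns the systemic loss induced beyond the initial shock.
    """
    h = h0.astype(float).copy()
    active = h > 0.0                 # banks currently spreading distress
    for _ in range(max_rounds):
        if not active.any():
            break
        dh = np.minimum(1.0, h + lam[:, active] @ h[active]) - h
        undistressed = h == 0.0
        h += dh
        active = undistressed & (h > 0.0)
    return float(v @ h - v @ h0)

# Hypothetical 3-bank network: exposure/equity matrix, unit shock on
# bank 0, and relative asset values (all numbers invented).
lam = np.array([[0.0, 0.3, 0.1],
                [0.4, 0.0, 0.2],
                [0.1, 0.2, 0.0]])
shock = np.array([1.0, 0.0, 0.0])
value = np.array([0.5, 0.3, 0.2])
R = debtrank(lam, shock, value)
```

The returned R measures how much economic value beyond the initial shock is put at risk by network reverberation, which is the quantity the abstract's "potential amplification of losses" refers to.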
We also provide additional evidence that the interbank market was extremely fragile up to the global financial crisis, becoming slightly more robust only afterwards.Giulio Ciminigiulio.cimini@imtlucca.itMatteo Serri2017-08-04T11:01:58Z2017-08-04T11:01:58Zhttp://eprints.imtlucca.it/id/eprint/3754This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37542017-08-04T11:01:58ZModel-based evaluation of scientific impact indicatorsUsing bibliometric data artificially generated through a model of citation dynamics calibrated on empirical data, we compare several indicators for the scientific impact of individual researchers. The use of such a controlled setup has the advantage of avoiding the biases present in real databases, and it allows us to assess which aspects of the model dynamics and which traits of individual researchers a particular indicator actually reflects. We find that the simple average citation count of the authored papers performs well in capturing the intrinsic scientific ability of researchers, regardless of the length of their career. On the other hand, when productivity complements ability in the evaluation process, the well-known h and g indices reveal their potential, yet their normalized variants do not always yield a fair comparison between researchers at different career stages. Notably, the use of logarithmic units for citation counts allows us to build simple indicators with performance equal to that of h and g. Our analysis may provide useful hints for a proper use of bibliometric indicators. 
Additionally, our framework can be extended by including other aspects of the scientific production process and citation dynamics, with the potential to become a standard tool for the assessment of impact metrics.Matúš MedoGiulio Ciminigiulio.cimini@imtlucca.it2017-08-04T10:57:35Z2017-08-04T10:57:35Zhttp://eprints.imtlucca.it/id/eprint/3753This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37532017-08-04T10:57:35ZStatistically validated network of portfolio overlaps and systemic riskCommon asset holding by financial institutions (portfolio overlap) is nowadays regarded as an important channel for financial contagion with the potential to trigger fire sales and severe losses at the systemic level. We propose a method to assess the statistical significance of the overlap between heterogeneously diversified portfolios, which we use to build a validated network of financial institutions where links indicate potential contagion channels. The method is implemented on a historical database of institutional holdings ranging from 1999 to the end of 2013, but can be applied to any bipartite network. We find that the proportion of validated links (i.e. of significant overlaps) increased steadily before the 2007–2008 financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that systemic risk from fire sales liquidation was maximal at that time. After a sharp drop in 2008, systemic risk resumed its growth in 2009, with a notable acceleration in 2013. 
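One textbook way to make "statistically significant overlap" between two portfolios precise (not necessarily the exact validation procedure used in the paper) is a hypergeometric test of the observed overlap against portfolios drawn at random from the asset universe; the sizes below are made up for illustration.

```python
from math import comb

def overlap_pvalue(n_assets, size_a, size_b, observed):
    """P(overlap >= observed) for two portfolios of the given sizes
    drawn uniformly at random from a universe of n_assets.

    A small p-value flags an overlap unlikely under random
    diversification, i.e. a candidate link of a validated network.
    """
    total = comb(n_assets, size_b)
    return sum(comb(size_a, k) * comb(n_assets - size_a, size_b - k)
               for k in range(observed, min(size_a, size_b) + 1)) / total

# Illustrative case: 100 assets, two 10-asset portfolios sharing 5
# assets, while the expected overlap under randomness is just 1.
p = overlap_pvalue(100, 10, 10, 5)
```

Thresholding such p-values over all institution pairs (with a multiple-testing correction, in practice) yields a sparse network of significant overlaps of the kind the abstract describes.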
We finally show that market trends tend to be amplified in the portfolios identified by the algorithm, such that it is possible to have an informative signal about institutions that are about to suffer (enjoy) the most significant losses (gains).Stanislao GualdiGiulio Ciminigiulio.cimini@imtlucca.itKevin PrimicerioRiccardo Di ClementeDamien Challet2017-05-24T09:04:21Z2017-05-24T09:04:21Zhttp://eprints.imtlucca.it/id/eprint/3708This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/37082017-05-24T09:04:21ZSmall and Medium Enterprises’ Competitiveness through Global Value ChainsDavide Del Pretedavide.delprete@imtlucca.itGiorgia GiovannettiEnrico Marvasi2017-05-08T12:27:18Z2017-05-08T12:27:18Zhttp://eprints.imtlucca.it/id/eprint/3697This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36972017-05-08T12:27:18Z(edited by) Proceedings of the 31st Annual ACM Symposium on Applied Computing, SAC 2016, Special track on service-oriented architectures and programming (SOAP)The SOAP track aims at bringing together researchers and practitioners having the common objective of transforming Service-Oriented Programming (SOP) into a mature discipline with both solid scientific foundations and mature software engineering development methodologies supported by dedicated tools. From the foundational point of view, many attempts to use formal methods for specification and verification in this setting have been made. Session correlation, service types, contract theories, and communication patterns are only a few examples of the aspects that have been investigated. Moreover, several formal models based upon automata, Petri nets and algebraic approaches have been developed. However, most of these approaches concentrate only on a few features of service-oriented systems in isolation, and a comprehensive approach is still lacking.
From the engineering point of view, there are open issues at many levels. Among others, at the system design level, both traditional approaches based on UML and approaches taking inspiration from Business Process Modelling, e.g. BPMN, are used. At the composition level, orchestration and choreography are continuously being improved both formally and practically, with an evident need for their integration in the development process. At the description and discovery level, there are two separate communities pushing respectively the semantic approach (like ontologies and OWL) and the syntactic one (like WSDL). In particular, the role of discovery engines and protocols is not clear. In this respect, adopted standards are still missing. UDDI looked to be a good candidate, but it is no longer pushed by the main corporations, and its wide adoption seems difficult. Furthermore, a recent implementation platform, the so-called REST services, is emerging and competing with classic Web Services. Finally, features like Quality of Service, security, and dependability need to be taken seriously into account.
SOAP in particular encouraged submissions on what SOP still needs in order to achieve the above goals.
The PC of SOAP 2016 was formed by:
• Farhad Arbab Leiden University and CWI, Amsterdam, NL
• Luís Barbosa University of Minho, Braga, PT
• Massimo Bartoletti Università di Cagliari, IT
• Maurice H. ter Beek ISTI-CNR, Pisa, IT (co-chair)
• Marcello M. Bersani Politecnico di Milano, IT
• Laura Bocchi University of Kent, UK
• Roberto Bruni Università di Pisa, IT
• Marco Carbone IT University of Copenhagen, DK
• Romain Demangeon Université Pierre et Marie Curie, FR
• Schahram Dustdar Vienna University of Technology, AT
• Alessandra Gorla IMDEA Software Institute, Madrid, ES
• Vasileios Koutavas Trinity College Dublin, IE
• Alberto Lluch Lafuente Technical University of Denmark, DK
• Manuel Mazzara Innopolis University, RU
• Hernán Melgratti University of Buenos Aires, AR (co-chair)
• Nicola Mezzetti University of Trento, IT
• Corrado Moiso Telecom Italia, IT
• Alberto Núñez Universidad Complutense de Madrid, ES
• Jorge A. Perez University of Groningen, NL
• Gustavo Petri Purdue University, USA
• António Ravara New University of Lisbon, PT
• Steve Ross-Talbot Cognizant Technology Solutions, UK
• Gwen Salaün Inria Grenoble - Rhône-Alpes, FR
• Francesco Tiezzi Università di Camerino, IT
• Hugo Torres Vieira IMT Lucca, IT (co-chair)
• Emilio Tuosto University of Leicester, UK
• Massimo Vecchio Università degli Studi eCampus, IT
• Peter Wong Travelex, UK
• Yongluan Zhou University of Southern Denmark, DK
SOAP 2016 received a total of 16 submissions. Each submission was reviewed by at least 4 PC members, the vast majority even by 5 PC members. All papers were subject to an animated general discussion among the PC members (with over 100 posts in the message boards). In the end, the PC decided to select only the following four papers for an oral presentation at the conference (an acceptance rate of 25%):
• JxActinium: a runtime manager for secure REST-ful COAP applications working over JXTA by Filippo Battaglia, Giancarlo Iannizzotto, and Lucia Lo Bello
• Improving QoS Delivered by WS-BPEL Scenario Adaptation through Service Execution Parallelization by Dionisis Margaris, Costas Vassilakis, and Panagiotis Georgiadis
• QoS-aware Adaptation for Complex Event Service by Feng Gao, Muhammad Ali, Edward Curry, and Alessandra Mileo
• Service functional testing automation with intelligent scheduling and planning by Lom Messan Hillah, Ariele-Paolo Maesano, Libero Maesano, Fabio De Rosa, Fabrice Kordon, and Pierre-Henri Wuillemin
We would like to thank the PC members, and a few external reviewers, for their detailed reports and the stimulating discussions during the reviewing phase; the authors of submitted papers, the session chairs and the attendees, for contributing to the success of the event; the providers of the START system, which was used to manage the submissions; and in particular all the organizers of SAC 2016, for their invitation to organize this track and for all their excellent assistance and support.Maurice H. ter BeekHernán C. MelgrattiHugo Torres Vieirahugo.torresvieira@imtlucca.it2017-05-04T14:16:41Z2017-05-04T14:16:41Zhttp://eprints.imtlucca.it/id/eprint/3695This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36952017-05-04T14:16:41ZFoundations of Session Types and Behavioural ContractsBehavioural type systems, usually associated to concurrent or distributed computations, encompass concepts such as interfaces, communication protocols, and contracts, in addition to the traditional input/output operations. The behavioural type of a software component specifies its expected patterns of interaction using expressive type languages, so types can be used to determine automatically whether the component interacts correctly with other components. Two related important notions of behavioural types are those of session types and behavioural contracts. 
This article surveys the main accomplishments of the last 20 years within these two approaches.Hans HuttelIvan LaneseVasco Thudichum VasconcelosLuis CairesMarco CarbonePierre-Malo DeniélouDimitris MostrousLuca PadovaniAntonio RavaraEmilio TuostoHugo Torres Vieirahugo.torresvieira@imtlucca.itGianluigi Zavattaro2017-05-04T14:10:39Z2017-05-04T14:10:39Zhttp://eprints.imtlucca.it/id/eprint/3694This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36942017-05-04T14:10:39ZDynamic role authorization in multiparty conversationsProtocols in distributed settings usually rely on the interaction of several parties and often identify the roles involved in communications. Roles may have a behavioral interpretation, as they do not necessarily correspond to sites or physical devices. Notions of role authorization thus become necessary to consider settings in which, e.g., different sites may be authorized to act on behalf of a single role, or in which one site may be authorized to act on behalf of different roles. This flexibility must be equipped with ways of controlling the roles that the different parties are authorized to represent, including the challenging case in which role authorizations are determined only at runtime. We present a typed framework for the analysis of multiparty interaction with dynamic role authorization and delegation. Building on previous work on conversation types with role assignment, our formal model is based on an extension of the π-calculus in which the basic resources are channel-role pairs, which denote the right to interact along a given channel representing the given role. To specify dynamic authorization control, our process model includes (1) a novel scoping construct for authorization domains, and (2) communication primitives for authorizations, which make it possible to pass around authorizations to act on a given channel. 
An authorization error then corresponds to an action involving a channel and a role not enclosed by an appropriate authorization scope. We introduce a typing discipline that ensures that processes never reduce to authorization errors, including when parties dynamically acquire authorizations.Silvia GhilezanSvetlana JakšićJovanka PantovićJorge A. PérezHugo Torres Vieirahugo.torresvieira@imtlucca.it2017-05-04T13:43:26Z2017-05-04T13:43:26Zhttp://eprints.imtlucca.it/id/eprint/3692This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36922017-05-04T13:43:26Z(edited by) Proceedings 9th Interaction and Concurrency Experience, ICE 2016, Heraklion, Greece, 8-9 June 2016This volume contains the proceedings of ICE 2016, the 9th Interaction and Concurrency Experience, which was held in Heraklion, Greece on the 8th and 9th of June 2016 as a satellite event of DisCoTec 2016. The ICE procedure for paper selection allows PC members to interact, anonymously, with authors. During the review phase, each submitted paper is published on a discussion forum whose access is restricted to the authors and to all the PC members not declaring a conflict of interest. The PC members post comments and questions that the authors reply to. For the first time, the 2016 edition of ICE included a feature targeting review transparency: reviews of accepted papers were made public on the workshop website and workshop participants in particular were able to access them during the workshop. Each paper was reviewed by three PC members, and altogether nine papers were accepted for publication (the workshop also featured three brief announcements which are not part of this volume). We were proud to host two invited talks, by Alexandra Silva and Uwe Nestmann. 
The abstracts of these two talks are included in this volume together with the regular papers.Massimo BartolettiLudovic HenrioSophia KnightHugo Torres Vieirahugo.torresvieira@imtlucca.it2017-05-04T13:42:01Z2017-05-04T13:42:01Zhttp://eprints.imtlucca.it/id/eprint/3693This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36932017-05-04T13:42:01ZPreface for the special issue on Interaction and Concurrency Experience 2014This special issue contains extended versions of selected papers from the 7th Interaction and Concurrency Experience workshop (ICE 2014). The workshop was held in Berlin (Germany) on June 6th, 2014. ICE workshops form a series of international scientific meetings oriented to theoretical computer science researchers with special interest in models, verification, tools, and programming primitives for complex interactions.
The general scope of the venue includes theoretical and applied aspects of interactions and the synchronization mechanisms used among components of concurrent/distributed systems, related to several areas of computer science in the broad spectrum ranging from formal specification and analysis to studies inspired by emerging computational models.
The authors of the most prominent papers presented at ICE 2014 were invited to submit an extended version to this special issue. In order to guarantee the fairness and quality of the selection process, each submission received at least three reviews. The review process has also ensured that the accepted articles significantly extend and improve the original workshop contributions.
This special issue features three articles:
• Declarative event based models of concurrency and refinement in psi-calculi, by Håkon Normann, Christian Johansen and Thomas Hildebrandt. In this paper the authors explore declarative event-based specifications that are open to runtime refinement, aiming at a declarative model with support for adaptation.
• Contracts as games on event structures, by Massimo Bartoletti, Tiziana Cimoli, G. Michele Pinna and Roberto Zunino. This work presents an event structure based interpretation of contracts, which makes it possible to study the rights and obligations of contract participants in a natural setting.
• Relating two automata-based models of orchestration and choreography, by Davide Basile, Pierpaolo Degano, Gian Luigi Ferrari and Emilio Tuosto. This paper compares local contract-based specifications coordinated by orchestrators with communicating machines that have decentralized coordination.
We want to thank all the authors who contributed to this volume. We would like to thank all the members of the Program Committee of ICE, who helped us in the selection of the papers and who helped the authors to improve their contributions in several ways. Additional referees were involved in the review of the papers invited for this special issue, and we thank them for their timely contributions. We would also like to thank the editors of JLAMP for their support during the whole editorial process.Ivan LaneseAlberto Lluch LafuenteAna SokolovaHugo Torres Vieirahugo.torresvieira@imtlucca.it2017-03-21T14:01:17Z2017-03-21T14:01:17Zhttp://eprints.imtlucca.it/id/eprint/3679This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36792017-03-21T14:01:17ZRevisiting the problem of a crack impinging on an interface: a modeling framework for the interaction between the phase field approach for brittle fracture and the interface cohesive zone modelThe problem of a crack impinging on an interface has been thoroughly investigated in the last three decades due to its important role in the mechanics and physics of solids. In this investigation, this problem is revisited in view of the recent progress on the phase field approach to brittle fracture. In this context, a novel formulation combining the phase field approach for modeling brittle fracture in the bulk and a cohesive zone model for pre-existing adhesive interfaces is herein proposed to investigate the competition between crack penetration and deflection at an interface. The model, implemented within the finite element method framework using a monolithic fully implicit solution strategy, is applied to provide further insight into the role of model parameters in the above competition. In particular, in this study, the role of the fracture toughness ratio between the interface and the adjoining bulks and of the characteristic fracture-length scales of the dissipative models is analyzed. 
In the case of a brittle interface, the asymptotic predictions based on linear elastic fracture mechanics criteria for crack penetration, single deflection or double deflection are fully captured by the present method. Moreover, by increasing the size of the process zone along the interface, or by varying the internal length scale of the phase field model, new complex phenomena emerge, such as simultaneous crack penetration and deflection and the transition from single crack penetration to deflection and penetration with subsequent branching into the bulk.Marco Paggimarco.paggi@imtlucca.itJosé Reinoso2017-03-21T13:49:11Z2017-09-04T13:34:32Zhttp://eprints.imtlucca.it/id/eprint/3678This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36782017-03-21T13:49:11ZLow-frequency oscillations in REM-sleep: a high density
EEG studyObjectives: Slow waves (0.5–4 Hz) of non-rapid eye movement
(NREM) sleep occur and are regulated locally, in an experience-dependent
manner. However, recent work in mice showed that
region-specific slow waves may also occur in REM sleep. Here we
investigated the presence and cortical distribution of low-frequency
oscillations in human REM sleep using high-density EEG.Giulio BernardiMonica Bettamonica.betta@imtlucca.itX. YuEmiliano Ricciardiemiliano.ricciardi@imtlucca.itJ. Haba-RubioR. HeinzerPietro Pietrinipietro.pietrini@imtlucca.itGiulio TononiFrancesca Siclari2017-03-21T12:12:26Z2017-03-21T12:12:26Zhttp://eprints.imtlucca.it/id/eprint/3672This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36722017-03-21T12:12:26ZOverall thermomechanical properties of layered materials for energy devices applicationsThis paper is concerned with the analysis of effective thermomechanical properties of multi-layered materials of interest for solid oxide fuel cell (SOFC) and lithium-ion battery fabrication. The recently developed asymptotic homogenization procedure is applied in order to express the overall thermoelastic constants of the first order equivalent continuum in terms of microfluctuation functions, and these functions are obtained by the solution of the corresponding recursive cell problems.
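For orientation, the crudest effective-stiffness estimates for such a layered stack are the classical Voigt and Reuss mixture rules, which bracket the overall modulus that the asymptotic homogenization procedure computes rigorously; the layer data below are invented, not SOFC values.

```python
def voigt_reuss(fractions, moduli):
    """Voigt (uniform-strain) and Reuss (uniform-stress) estimates of
    the effective elastic modulus of a layered stack.

    fractions: layer volume fractions (must sum to 1); moduli: the
    corresponding layer moduli. The two values bound the true
    homogenized modulus from above and below, respectively.
    """
    assert abs(sum(fractions) - 1.0) < 1e-12
    voigt = sum(f * e for f, e in zip(fractions, moduli))
    reuss = 1.0 / sum(f / e for f, e in zip(fractions, moduli))
    return voigt, reuss

# Two-phase stack: a stiff electrolyte-like layer and a compliant
# electrode-like layer (moduli in GPa, values invented).
upper, lower = voigt_reuss([0.3, 0.7], [200.0, 60.0])
```

The gap between the two bounds is exactly what the cell-problem solution resolves: loading parallel to the layers approaches the Voigt value, loading across them the Reuss value.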
The effects of thermal stresses on a periodic multi-layered thermoelastic composite reproducing the characteristics of solid oxide fuel cells (SOFC-like) are studied assuming periodic body forces and heat sources, and the solution derived by means of the asymptotic homogenization approach is compared with the results obtained by finite element analysis of the associated heterogeneous material.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itLorenzo MoriniAndrea Piccolroaz2017-03-21T12:00:57Z2017-03-21T12:00:57Zhttp://eprints.imtlucca.it/id/eprint/3669This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36692017-03-21T12:00:57ZDispersive wave propagation in two-dimensional rigid periodic blocky materials with elastic interfacesDispersive waves in two-dimensional blocky materials with periodic microstructure made up of equal rigid units having polygonal centro-symmetric shape with mass and gyroscopic inertia, connected to each other through homogeneous linear interfaces, have been analysed. The acoustic behavior of the resulting discrete Lagrangian model has been obtained through a Floquet-Bloch approach. From the resulting eigenproblem derived by the Euler-Lagrange equations for harmonic wave propagation, two acoustic branches and an optical branch are obtained in the frequency spectrum. A micropolar continuum model to approximate the Lagrangian model has been derived based on a second-order Taylor expansion of the generalized macro-displacement field. The constitutive equations of the equivalent micropolar continuum have been obtained, with the peculiarity that the positive definiteness of the second-order symmetric tensor associated with the curvature vector is not guaranteed and depends both on the ratio between the local tangent and normal stiffness and on the block shape. 
The same results have been obtained through an extended Hamiltonian derivation of the equations of motion for the equivalent continuum that is related to the Hill-Mandel macro homogeneity condition. Moreover, it is shown that the Hermitian matrix governing the eigenproblem of harmonic wave propagation in the micropolar model is exact up to the second order in the norm of the wave vector with respect to the same matrix from the discrete model. To appreciate the acoustic behavior of some relevant blocky materials and to understand the reliability and the validity limits of the micropolar continuum model, some blocky patterns have been analysed: rhombic and hexagonal assemblages and running bond masonry.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itLuigi Gambarotta2017-03-21T11:56:47Z2017-03-21T11:56:47Zhttp://eprints.imtlucca.it/id/eprint/3668This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36682017-03-21T11:56:47ZHigh-frequency parametric approximation of the Floquet-Bloch spectrum for anti-tetrachiral materialsThe class of anti-tetrachiral cellular materials is phenomenologically characterized by a strong auxeticity of the elastic macroscopic response. The auxetic behavior is activated by rolling-up deformation mechanisms developed by the material microstructure, composed of a periodic pattern of stiff rings connected by flexible ligaments. A linear beam lattice model is formulated to describe the free dynamic response of the periodic cell, in the absence of a soft matrix. After a static condensation of the passive degrees-of-freedom, a general procedure is applied to analyze the wave propagation in the low-dimensional space of the active degrees-of-freedom. The exact dispersion functions are compared with explicit – although approximate – dispersion relations, obtained from asymptotic perturbation solutions of the eigenproblem governing the Floquet–Bloch theory. 
A general hierarchical scheme is outlined to formulate and solve the perturbation equations, taking into account the dimension of the perturbation vector. Original recursive formulas are presented to achieve any desired order of asymptotic approximation. For the anti-tetrachiral material, the fourth-order asymptotic solutions are found to approximate the dispersion curves with fine agreement over wide regions of the parameter space. The asymptotic eigensolutions allow an accurate sensitivity analysis of the material spectrum under variation of the key physical parameters, including the cell aspect ratio, the ligament slenderness and the spatial ring density. Finally, the explicit dependence of the dispersion functions on the mechanical parameters may facilitate the custom design of specific spectral properties, such as the wave velocities and band gap amplitudes.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itMarco Lepidi2017-03-21T11:05:10Z2017-09-21T14:56:38Zhttp://eprints.imtlucca.it/id/eprint/3666This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36662017-03-21T11:05:10ZOptimal design of low-frequency band gaps in anti-tetrachiral lattice meta-materialsThe elastic wave propagation is investigated in a beam lattice material characterized by a square periodic cell with anti-tetrachiral microstructure. With reference to the Floquet-Bloch spectrum, focus is made on the band structure enrichments and modifications which can be achieved by equipping the cellular microstructure with tunable local resonators. By virtue of its composite mechanical nature, the so-built inertial meta-material gains enhanced capacities of passive frequency-band filtering. Indeed the number, placement and properties of the inertial resonators can be designed to open, shift and enlarge the band gaps between one or more pairs of consecutive branches in the frequency spectrum. 
In order to improve the meta-material performance, several nonlinear optimization problems are formulated. The largest among the band gap amplitudes in the low-frequency range is selected as a suitable objective function. Proper inequality constraints are introduced to restrict the admissible solutions to a compact set of mechanical and geometric parameters, including only physically realistic properties of both the lattice and the resonators. The optimization problems related to full and partial band gaps are solved by using a globally convergent version of the numerical method of moving asymptotes, combined with a quasi-Monte Carlo multi-start technique. The optimal solutions are numerically computed, discussed and compared from the qualitative and quantitative viewpoints, bringing to light the limits and potential of the meta-material performance. The clearest trends emerging from the numerical analyses are pointed out and interpreted from the physical viewpoint. Finally, some specific recommendations about the microstructural design of the meta-material are synthesized.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itGiorgio Gneccogiorgio.gnecco@imtlucca.itMarco LepidiLuigi Gambarotta2017-03-21T10:56:30Z2017-03-21T10:56:30Zhttp://eprints.imtlucca.it/id/eprint/3664This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36642017-03-21T10:56:30ZDesign of acoustic metamaterials through nonlinear programmingThe dispersive wave propagation in a periodic metamaterial with tetrachiral topology and inertial local resonators is investigated. The Floquet-Bloch spectrum of the metamaterial is compared with that of the tetrachiral beam lattice material without resonators. The resonators can be designed to open and shift frequency band gaps, that is, spectrum intervals in which harmonic waves do not propagate. Therefore, an optimal passive control of the frequency band structure can be pursued in the metamaterial. 
To this aim, a suitable constrained nonlinear optimization problem on a compact set of admissible geometrical and mechanical parameters is stated. According to functional requirements, the particular set of parameters which determines the largest low-frequency band gap between a pair of consecutive branches of the Floquet-Bloch spectrum is obtained. The optimization problem is successfully solved by means of a version of the method of moving asymptotes, combined with a quasi-Monte Carlo multi-start technique.
Subjects: Materials Science (cond-mat.mtrl-sci)
Cite as: arXiv:1603.07717 [cond-mat.mtrl-sci]
Andrea Bacigalupoandrea.bacigalupo@imtlucca.itGiorgio Gneccogiorgio.gnecco@imtlucca.itMarco LepidiLuigi Gambarotta2017-03-09T10:12:31Z2017-03-09T10:12:31Zhttp://eprints.imtlucca.it/id/eprint/3659This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36592017-03-09T10:12:31ZThe Accounting Network: How Financial Institutions React to Systemic CrisisThe role of Network Theory in the study of the financial crisis has been widely recognized in recent years. It has been shown how the network topology and the dynamics running on top of it can trigger the outbreak of large systemic crises. Following this methodological perspective, we introduce here the Accounting Network, i.e. the network we can extract through vector similarity techniques from companies’ financial statements. We build the Accounting Network on a large database of worldwide banks in the period 2001–2013, covering the onset of the global financial crisis of mid-2007. After a careful data cleaning, we apply a quality check in the construction of the network, introducing a parameter (the Quality Ratio) capable of trading off the size of the sample (coverage) and the representativeness of the financial statements (accuracy). We compute several basic network statistics and check, with the Louvain community detection algorithm, for emerging communities of banks. Remarkably, sensible regional aggregations show up with the Japanese and the US clusters dominating the community structure, although the presence of a geographically mixed community points to a gradual convergence of banks into similar supranational practices. Finally, a Principal Component Analysis procedure reveals the main economic components that influence communities’ heterogeneity. 
Even using the most basic vector similarity hypotheses on the composition of the financial statements, the signature of the financial crisis clearly arises across the years around 2008. We finally discuss how the Accounting Network can be improved to reflect best practices in financial statement analysis.Michelangelo Puligamichelangelo.puliga@imtlucca.itAndrea Floriandrea.flori@imtlucca.itGiuseppe PappalardoAlessandro Chessaalessandro.chessa@imtlucca.itFabio Pammolli2017-02-01T08:36:30Z2017-02-01T08:36:30Zhttp://eprints.imtlucca.it/id/eprint/3651This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36512017-02-01T08:36:30ZMRI-TRUS Image Synthesis with Application to Image-Guided Prostate InterventionAccurate and robust fusion of pre-procedure magnetic resonance imaging (MRI) to intra-procedure trans-rectal ultrasound (TRUS) imaging is necessary for image-guided prostate cancer biopsy procedures. The current clinical standard for image fusion relies on non-rigid surface-based registration between semi-automatically segmented prostate surfaces in both the MRI and TRUS. This surface-based registration method does not take advantage of internal anatomical prostate structures, which have the potential to provide useful information for image registration. However, non-rigid, multi-modal intensity-based MRI-TRUS registration is challenging due to highly non-linear intensity relationships between MRI and TRUS. In this paper, we present preliminary work using image synthesis to cast this problem into a mono-modal registration task by using a large database of over 100 clinical MRI-TRUS image pairs to learn a joint model of MR-TRUS appearance. Thus, given an MRI, we use this learned joint appearance model to synthesize the patient’s corresponding TRUS image appearance with which we could potentially perform mono-modal intensity-based registration. We present preliminary results of this approach.John A. 
OnofreyIlkay Oksuzilkay.oksuz@imtlucca.itSaradwata SarkarRajesh VenkataramanLawrence H. StaibXenophon Papademetris2017-01-26T14:46:16Z2017-01-26T14:46:16Zhttp://eprints.imtlucca.it/id/eprint/3645This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36452017-01-26T14:46:16ZStochastic gradient methods for stochastic model predictive controlWe introduce a new stochastic gradient algorithm, SAAGA, and investigate its employment for solving Stochastic MPC problems and multi-stage stochastic optimization programs in general. The method is particularly attractive for scenario-based formulations that involve a large number of scenarios, for which “batch” formulations may become inefficient due to high computational costs. Benefits of the method include cheap computations per iteration and fast convergence due to the sparsity of the proposed problem decomposition.A. ThemelisS. VillaPanagiotis PatrinosAlberto Bemporadalberto.bemporad@imtlucca.it2017-01-26T14:17:52Z2017-01-26T14:17:52Zhttp://eprints.imtlucca.it/id/eprint/3641This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36412017-01-26T14:17:52ZGPU-accelerated stochastic predictive control of drinking water networksAjay Kumar SampathiraoPantelis SopasakisAlberto Bemporadalberto.bemporad@imtlucca.itPanagiotis Patrinos2017-01-24T13:14:41Z2017-01-24T13:14:41Zhttp://eprints.imtlucca.it/id/eprint/3637This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36372017-01-24T13:14:41ZFrom linear to nonlinear MPC: bridging the gap via the real-time iterationLinear model predictive control (MPC) can be currently deployed at outstanding speeds, thanks to recent progress in algorithms for solving online the underlying structured quadratic programs. In contrast, nonlinear MPC (NMPC) requires the deployment of more elaborate algorithms, which require longer computation times than linear MPC. 
Nonetheless, computational speeds for NMPC comparable to those of MPC are now regularly reported, provided that adequate algorithms are used. In this paper, we aim at clarifying the similarities and differences between linear MPC and NMPC. In particular, we focus our analysis on NMPC based on the real-time iteration (RTI) scheme, as this technique has been successfully tested and, in some applications, requires computational times that are only marginally larger than those of linear MPC. The goal of the paper is to promote the understanding of RTI-based NMPC within the linear MPC community.Sébastien GrosMario ZanonRien QuirynenAlberto Bemporadalberto.bemporad@imtlucca.itMoritz Diehl2017-01-24T13:10:56Z2017-01-24T13:16:05Zhttp://eprints.imtlucca.it/id/eprint/3636This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36362017-01-24T13:10:56ZA Lyapunov method for stability analysis of piecewise-affine systems over non-invariant domainsThis paper analyses stability of discrete-time piecewise-affine systems, defined on possibly non-invariant domains, taking into account the possible presence of multiple dynamics in each of the polytopic regions of the system. An algorithm based on linear programming is proposed, in order to prove exponential stability of the origin and to find a positively invariant estimate of its region of attraction. The results are based on the definition of a piecewise-affine Lyapunov function, which is in general discontinuous on the boundaries of the regions. The proposed method is proven to lead to feasible solutions in a broader range of cases as compared to a previously proposed approach. 
Two numerical examples are shown, including a case in which the proposed method is applied to a closed-loop system to which model predictive control was applied without an a priori guarantee of stability.Matteo RubagottiLuca ZaccarianAlberto Bemporadalberto.bemporad@imtlucca.it2017-01-09T10:13:36Z2017-08-04T10:18:34Zhttp://eprints.imtlucca.it/id/eprint/3624This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36242017-01-09T10:13:36ZProgression from Vegetative to Minimally Conscious State Is Associated with Changes in Brain Neural Response to Passive Tasks: A Longitudinal Single-Case Functional MRI StudyObjectives: Functional magnetic resonance imaging (fMRI) may be adopted as a complementary tool for bedside observation in the disorders of consciousness (DOC). However, the diagnostic value of this technique is still debated because of the lack of accuracy in determining levels of consciousness within a single patient. Recently, Giacino and colleagues (2014) hypothesized that a longitudinal fMRI evaluation may provide a more informative assessment in the detection of residual awareness. The aim of this study was to measure the correspondence between clinically defined level of awareness and neural responses within a single DOC patient. Methods: We used a follow-up fMRI design in combination with a passive speech-processing task. The patient’s consciousness was measured over time using the Coma Recovery Scale. Results: The patient progressed from a vegetative state (VS) to a minimally conscious state (MCS). The patient’s task-related neural responses mirrored the clinical change from a VS to an MCS. Specifically, while in an MCS, but not a VS, the patient showed a selective recruitment of the left angular gyrus when he listened to a native speech narrative, as compared to the reverse presentation of the same stimulus. 
Furthermore, the patient showed an increased response in the language-related brain network and a greater deactivation in the default mode network following his progression to an MCS. Conclusions: Our findings indicate that longitudinal assessment of brain responses to passive stimuli can contribute to the definition of the clinical status in individual patients with DOC and represents an adequate counterpart of the bedside assessment during the diagnostic decision-making process. (JINS, 2016, 22, 620–630)Francesco TomaiuoloLuca Cecchettiluca.cecchetti@imtlucca.itRaechelle M. GibsonFiammetta LogiAdrian M. OwenFranco MalasomaSabino CozzaPietro Pietrinipietro.pietrini@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2017-01-09T10:10:50Z2017-01-09T10:10:50Zhttp://eprints.imtlucca.it/id/eprint/3623This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36232017-01-09T10:10:50ZAn unusual case of acquired pedophilic behavior following compression of orbitofrontal cortex and hypothalamus by a Clivus ChordomaGiuseppe SartoriCristina ScarpazzaSara CodognottoPietro Pietrinipietro.pietrini@imtlucca.it2017-01-09T10:03:28Z2017-01-09T10:03:28Zhttp://eprints.imtlucca.it/id/eprint/3622This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36222017-01-09T10:03:28ZSevere Psychopathy May Deserve Special ConsiderationStefano FerracutiPietro Pietrinipietro.pietrini@imtlucca.it2017-01-09T09:58:32Z2017-01-09T09:58:32Zhttp://eprints.imtlucca.it/id/eprint/3621This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36212017-01-09T09:58:32ZSponsorship bias in the comparative efficacy of psychotherapy and pharmacotherapy for adult depression: meta-analysisIoana A. 
CristeaClaudio GentiliPietro Pietrinipietro.pietrini@imtlucca.itPim Cuijpers2017-01-09T09:34:59Z2018-02-28T11:40:40Zhttp://eprints.imtlucca.it/id/eprint/3619This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36192017-01-09T09:34:59ZHarm aversion explains utilitarian choices in moral decision-making in males but not in femalesIn recent years, a great deal of research has relied on hypothetical sacrificial dilemmas to investigate decision-making processes involved in pro-social utilitarian choices. Recent evidence, however, has suggested that moral sacrificial choices may actually reflect reduced harm aversion and antisocial dispositions rather than a utilitarian inclination. Here, we used moral dilemmas to confront healthy volunteers with controversial action choices. We measured impulsiveness and venturesomeness personality traits, which have been shown to influence harm aversion, to test their role in utilitarian action and evaluation of moral acceptability. The results of the present study show that, in males, venturesomeness drives engagement in actions and increases moral acceptability. In contrast, in females no effects of venturesomeness were observed on moral action and evaluation. Rather, in females empathetic concern and personal distress, elicited by the vicarious experience of the other’s emotional states, exerted an inhibitory effect on action. Taken together, these findings indicate that the “harm aversion hypothesis” may help to explain utilitarian choices in males but not in females. In both genders, no association was observed between impulsiveness and moral action.Giuseppina RotaS. PalumboNicola Lattanzinicola.lattanzi@imtlucca.itA. ManfrinatiM. SarloL. LottoPietro Pietrinipietro.pietrini@imtlucca.itR. 
RumiatiSilvia Pellegrini2016-12-27T09:12:55Z2016-12-27T09:12:55Zhttp://eprints.imtlucca.it/id/eprint/3611This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36112016-12-27T09:12:55ZA computational framework for the interplay between delamination and wrinkling in functionally graded thermal barrier coatingsStiff films bonded to compliant substrates are used in a wide range of technological applications and especially in thermal barrier coatings (TBC). Thin films can be made of Functionally Graded Materials (FGMs) with a heterogeneous composition that usually ranges from a metallic to a ceramic phase. Aiming at investigating the phenomenon of delamination of thin FGM layers from compressed elastic substrates, a fully 3D nonlinear computational framework combining nonlinear fracture mechanics based on a novel interface element formulation for large displacements and a solid shell finite element to model the thin film is proposed. A comprehensive numerical analysis of delamination in TBCs is carried out, paying special attention to the interplay between fracture and wrinkling instabilities. Results of the computations are also compared with benchmark 2D semi-analytical results, showing good accuracy of the proposed method that can be applied to general 3D configurations that are difficult to address by semi-analytical approaches.José ReinosoMarco Paggimarco.paggi@imtlucca.itRaimund Rolfes2016-12-27T09:12:48Z2016-12-27T09:12:48Zhttp://eprints.imtlucca.it/id/eprint/3612This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36122016-12-27T09:12:48ZA finite element framework for the interplay between delamination and buckling of rubber-like bi-material systems and stretchable electronicsIn this study, a finite element (FE) framework for the analysis of the interplay between buckling and delamination of thin layers bonded to soft substrates is proposed. 
The current framework incorporates the following modeling features: (i) geometrically nonlinear solid shell elements, (ii) geometrically nonlinear cohesive interface elements, and (iii) hyperelastic material constitutive response for the bodies that compose the system. A fully implicit Newton–Raphson solution strategy is adopted to deal with the complex simultaneous presence of geometrical and material nonlinearities through the derivation of the consistent FE formulation. Applications to a rubber-like bi-material system under finite bending and to patterned stiff islands resting on a soft substrate for stretchable solar cells subjected to tensile loading are proposed. The results obtained are in good agreement with benchmark results available in the literature, confirming the accuracy and the capabilities of the proposed numerical method for the analysis of complex three-dimensional fracture mechanics problems under finite deformations.José ReinosoMarco Paggimarco.paggi@imtlucca.itP Areias2016-12-27T09:12:41Z2016-12-27T09:12:41Zhttp://eprints.imtlucca.it/id/eprint/3613This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36132016-12-27T09:12:41ZA geometrical multi-scale numerical method for coupled hygro-thermo-mechanical problems in photovoltaic laminatesA comprehensive computational framework based on the finite element method for the simulation of coupled hygro-thermo-mechanical problems in photovoltaic laminates is herein proposed. While the thermo-mechanical problem takes place in the three-dimensional space of the laminate, moisture diffusion occurs in a two-dimensional domain represented by the polymeric layers and by the vertical channel cracks in the solar cells. 
Therefore, a geometrical multi-scale solution strategy is pursued by solving the partial differential equations governing heat transfer and thermo-elasticity in the three-dimensional space, and the partial differential equation for moisture diffusion in the two-dimensional domains. By exploiting a staggered scheme, the thermo-mechanical problem is solved first via a fully implicit solution scheme in space and time, with a specific treatment of the polymeric layers as zero-thickness interfaces whose constitutive response is governed by a novel thermo-visco-elastic cohesive zone model based on fractional calculus. Temperature and relative displacements along the domains where moisture diffusion takes place are then projected onto the finite element model of diffusion, coupled with the thermo-mechanical problem by the temperature and crack opening dependent diffusion coefficient. The application of the proposed method to photovoltaic modules pinpoints two important physical aspects: (i) moisture diffusion in humidity freeze tests with a temperature dependent diffusivity is a much slower process than in the case of a constant diffusion coefficient; (ii) channel cracks through Silicon solar cells significantly enhance moisture diffusion and electric degradation, as confirmed by experimental tests.Pietro Lenardapietro.lenarda@imtlucca.itMarco Paggimarco.paggi@imtlucca.it2016-12-27T09:12:31Z2016-12-27T09:12:31Zhttp://eprints.imtlucca.it/id/eprint/3614This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36142016-12-27T09:12:31ZA global/local approach for the prediction of the electric response of cracked solar cells in photovoltaic modules under the action of mechanical loadsA numerical approach based on the finite element method to assess the impact of cracks in Silicon solar cells on the electric response of photovoltaic modules is proposed. 
A global coarse-scale finite element model of the composite laminate is used for carrying out the structural analysis. The computed displacements at the edges of each solar cell are passed via a projection scheme as boundary conditions to a 3D local fine-scale finite element model of the cells which accounts for cohesive cracks. The evaluated crack opening displacements along the crack faces are finally used as input to an electric model characterizing the grid line/solar cell ensemble. The identification of the relation between the localized electric resistance due to cracks and the crack opening, to be used as a constitutive model of cracks, is finally discussed in reference to experimental tests performed in the laboratory.Marco Paggimarco.paggi@imtlucca.itMauro Corradomauro.corrado@polito.itIrene Berardoneirene.berardone@polito.it2016-12-27T09:12:26Z2016-12-27T09:12:26Zhttp://eprints.imtlucca.it/id/eprint/3615This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36152016-12-27T09:12:26ZA two parameter elasto-plastic formulation for hardening pressure-dependent materialsA new pressure-dependent yield function is proposed by introducing a plastic Poisson's ratio within the theoretical formulation of the plastic potential. In analogy with other classical models, an equivalent stress and an equivalent plastic strain increment are defined. Then, according to these definitions, the equivalent stress–strain curve is derived and an exponential hardening law is introduced. 
The advantage of the proposed formulation over alternative approaches lies in the explicit closed-form expressions of the flow rules and of the plastic multiplier.Valerio Carollovalerio.carollo@imtlucca.itMarco Paggimarco.paggi@imtlucca.itAlberto Rossani2016-12-27T09:12:15Z2016-12-27T09:12:15Zhttp://eprints.imtlucca.it/id/eprint/3616This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36162016-12-27T09:12:15ZAdhesive behaviour of bonded paper layers: Mechanical testing and statistical modellingIn this study, an experimental methodology based on micromechanical testing inside a scanning electron microscope is proposed to characterise bonding of paper layers connected by wet pressing. The peeling force–displacement evolution law that characterises the delamination of micromechanical double cantilever beam specimens of paper tissue has been extracted from such peeling tests. It is observed that the force–displacement evolution curve achieves a steady-state value related to the effective adhesive energy of the interface. This behaviour is explained by examining the complex load transfer mechanism between the layers exerted by cellulose fibrils. A statistical approach is used for the computation of the effective adhesive energy. It is argued that the observed force–displacement evolution law may be satisfactorily described by a stochastic model that depends on the distribution function of the fibril strength, and on two geometrical distribution functions related to the in-plane and out-of-plane fibril angles with respect to the undeformed interface configuration. 
Some applications of the proposed model are demonstrated through examples.Claudia Borriclaudia.borri@imtlucca.itMarco Paggimarco.paggi@imtlucca.itJosé ReinosoFeodor M Borodich2016-12-27T09:06:58Z2016-12-27T09:06:58Zhttp://eprints.imtlucca.it/id/eprint/3617This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36172016-12-27T09:06:58ZMechanical properties of Graphene: Molecular dynamics simulations correlated to continuum based scaling lawsIn this paper, the combined effect of domain size, lattice orientation and crack length on the mechanical properties of Graphene, namely the yield strength and strain, is studied extensively based on molecular dynamics simulations. Numerical predictions are compared with the continuum-based laws of size effect and multifractal scaling. The yield strength is found to vary with the specimen size as ≈L^{−1/3}, which is in agreement with the multifractal scaling law, and with the inverse square root of the initial crack length as ≈a^{-1/2}, according to Griffith’s energy criterion for fracture.Brahmanandam JavvajiPattabhi R. Budarapupattabhi.budarapu@imtlucca.itV. K. SutrakarD. Roy MahapatraMarco Paggimarco.paggi@imtlucca.itGoangseup ZiTimon Rabczuk2016-12-27T09:05:28Z2016-12-27T09:05:28Zhttp://eprints.imtlucca.it/id/eprint/3610This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36102016-12-27T09:05:28ZA 3D coupled thermo-visco-elastic shear-lag formulation for the prediction of residual stresses in photovoltaic modules after laminationEvaluation of the residual stress distribution arising from lamination of photovoltaic (PV) modules is important to address thermomechanically induced failure of PV modules during service. In view of the fact that PV modules contain several Silicon cells, modelling the thermo-mechanical response of PV laminates during cooling after lamination is computationally challenging. 
Due to the coupling between the thermal and the mechanical fields, the stress state experienced by each silicon cell in a module varies from one position to another. Here, a novel 3D coupled thermo-visco-elastic shear-lag model is proposed to determine the stress distribution in a PV module after lamination. To enhance the prediction of stress distribution in the laminate, viscoelastic properties of the EVA encapsulant are taken into account by using an asymptotic model which is stable for small and large time steps of strain increments. The results for a simulated mini-module show that residual stresses vary significantly from point to point inside the PV module.Saheed Olalekan Ojosaheed.ojo@imtlucca.itMarco Paggimarco.paggi@imtlucca.it2016-12-22T10:10:55Z2017-01-13T10:08:17Zhttp://eprints.imtlucca.it/id/eprint/3609This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36092016-12-22T10:10:55ZTax Morale, Aversion to Ethnic Diversity, and DecentralizationThis paper analyzes the relationship between individuals' aversion to ethnic diversity, the degree of fiscal and political decentralization, and tax morale. Our theory is based on the assumption that individuals are risk averse in contributing to the provision of public goods benefiting other ethnic groups, and therefore display a lower tax morale. We find scope for policy intervention: specifically, our model predicts that the effect of individuals' aversion to ethnic diversity on tax morale is smaller or null in decentralized political and fiscal systems relative to centralized ones. The theory highlights the role of decentralization reforms in cutting down inter-ethnic redistribution in conflicting environments. We test our results by using individual data from the World Values Survey, and several decentralization measures from Fan et al. (2009). 
According to our most preferred estimation, a one-scale change in the attitude toward ethnic diversity reduces tax morale of 0.03 in centralized system. We rather find no impact in decentralized states.Alessandro Belmontealessandro.belmonte@imtlucca.itRoberto Dell'AnnoDésirée Teobaldelli2016-12-13T15:24:29Z2016-12-13T15:24:29Zhttp://eprints.imtlucca.it/id/eprint/3608This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36082016-12-13T15:24:29ZPeer-Group Detection of Banks and Resilience to DistressThe paper looks at the importance of the true business model in shaping the risk profile of
financial institutions. We adopt a novel indirect clustering approach to enrich the classic bank
business model classification on a global data set including about 11,000 banks, both listed and
non-listed, representing more than 180 countries over the period 2005-2014. A comprehensive
list of global distress events, which combines bankruptcies, liquidations, defaults, distressed
mergers, and public bailouts, is regressed against financial statement ratios (i.e., proxies for
CAMELS), controlling for macro and sectoral effects, using a rare-event logit model. Our
findings suggest that individual characteristics, along with macro and sectoral factors, contribute
differently, sometimes with opposite signs, to the likelihood of distress and to the volatility
of business models, with the exception of liquidity, whose contribution appears exogenous to
business model choice. By capturing the switching behaviour across groups, we find that
business model volatility exacerbates vulnerability and distress, especially when moving from
wholesale-oriented to deposit oriented groups.Andrea Floriandrea.flori@imtlucca.itSimone GiansanteFabio Pammollif.pammolli@imtlucca.it2016-11-30T10:37:34Z2016-11-30T10:37:34Zhttp://eprints.imtlucca.it/id/eprint/3606This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36062016-11-30T10:37:34ZMachine Learning for Plant Phenotyping Needs Image ProcessingWe found the article by Singh et al. [1] extremely interesting because it introduces and showcases the utility of machine learning for high-throughput data-driven plant phenotyping. With this letter we aim to emphasize the role that image analysis and processing have in the phenotyping pipeline beyond what is suggested in [1], both in analyzing phenotyping data (e.g., to measure growth) and when providing effective feature extraction to be used by machine learning. Key recent reviews have shown that it is image analysis itself (what the authors of [1] consider as part of pre-processing) that has brought a renaissance in phenotyping [2].Sotirios A. TsaftarisMassimo Minervinimassimo.minervini@imtlucca.itHanno Scharr2016-11-28T17:18:28Z2017-08-04T10:15:38Zhttp://eprints.imtlucca.it/id/eprint/3605This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36052016-11-28T17:18:28ZAre Supramodality and Cross-Modal Plasticity the Yin and Yang of Brain Development? From Blindness to RehabilitationResearch in blind individuals has primarily focused for a long time on the brain plastic reorganization that occurs in early visual areas. Only more recently, scientists have developed innovative strategies to understand to what extent vision is truly a mandatory prerequisite for the brain’s fine morphological architecture to develop and function. 
As a whole, the studies conducted to date in sighted and congenitally blind individuals have provided ample evidence that several ‘visual’ cortical areas develop independently from visual experience and do process information content regardless of the sensory modality through which a particular stimulus is conveyed: a property named supramodality. At the same time, lack of vision leads to a structural and functional reorganization within 'visual' brain areas, a phenomenon known as cross-modal plasticity. Cross-modal recruitment of the occipital cortex in visually deprived individuals represents an adaptive compensatory mechanism that mediates processing of non-visual inputs. Supramodality and cross-modal plasticity appear to be the 'yin and yang' of brain development: supramodal is what takes place despite the lack of vision, whereas cross-modal is what happens because of the lack of vision. Here we provide a critical overview of the research in this field and discuss the implications that these novel findings have for the development of educational/rehabilitation approaches and sensory substitution devices in sensory-impaired individuals.Luca Cecchettiluca.cecchetti@imtlucca.itRon KupersMaurice PtitoPietro Pietrinipietro.pietrini@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2016-11-28T17:15:33Z2017-08-04T11:53:03Zhttp://eprints.imtlucca.it/id/eprint/3604This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36042016-11-28T17:15:33ZWhen Neuroscience ‘Touches’ Architecture: From Hapticity to a Supramodal Functioning of the Human BrainIn the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design.
Since 2003, the Academy of Neuroscience for Architecture has been supporting ‘neuro-architecture’ as a way to connect neuroscience and the study of behavioral responses to the built environment. Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. More precisely, supramodality refers to the ability of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience.
Comparing and connecting these two different approaches may lead to novel observations regarding people–environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.Paolo PapaleLeonardo ChiesiAlessandra Cecilia Rampininialessandra.rampinini@imtlucca.itPietro Pietrinipietro.pietrini@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2016-11-28T17:08:03Z2016-11-28T17:08:03Zhttp://eprints.imtlucca.it/id/eprint/3603This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36032016-11-28T17:08:03Z(Review article) Sponsorship bias in the comparative efficacy of psychotherapy and pharmacotherapy for adult depression: meta-analysisBackground: Sponsorship bias has never been investigated for non-pharmacological treatments like psychotherapy. Aims: We examined industry funding and author financial conflict of interest (COI) in randomised controlled trials directly comparing psychotherapy and pharmacotherapy in depression. Method: We conducted a meta-analysis with subgroup comparisons for industry v. non-industry-funded trials, and respectively for trial reports with author financial COI v. those without. Results: In total, 45 studies were included. In most analyses, pharmacotherapy consistently showed significant effectiveness over psychotherapy, g = -0.11 (95% CI -0.21 to -0.02) in industry-funded trials. Differences between industry and non-industry-funded trials were significant, a result only partly confirmed in sensitivity analyses. We identified five instances where authors of the original article had not reported financial COI. Conclusions: Industry-funded trials for depression appear to subtly favour pharmacotherapy over psychotherapy. Disclosure of all financial ties with the pharmaceutical industry should be encouraged.Ioana A.
CristeaClaudio GentiliPietro Pietrinipietro.pietrini@imtlucca.itPim Cuijpers2016-11-28T16:31:33Z2017-09-26T08:08:44Zhttp://eprints.imtlucca.it/id/eprint/3602This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/36022016-11-28T16:31:33ZOsservazioni sull’inconscio otticoThis essay is based on an analysis of the notion of “Optical Unconscious” by Walter Benjamin.
It seeks to present an interpretation of this notion in connection with the historical relationship between the birth of cinematic technology, on the one hand, and research conducted during the same period in the field of physiology investigating human and animal movement, on the other.
To this end, the essay analyzes the following three concepts: (a) alienation, (b) automatism, and (c) invisibility.
(a) In "Minutiae, Close-up, Microanalysis", Carlo Ginzburg formulates an analogy to describe the “optical unconscious” and juxtaposes it with a page from Marcel Proust in which the narrator's alien gaze parallels the imperturbable lens of a camera, and he experiences the physiognomy of the objects in their anonymous being.
Through reference to this passage, I seek to prove that the meaning of the photographic image does not reside in its ability to reflect its object as something real and familiar, but rather in its ability to alienate this object and make it foreign to the observer.
(b) This paper will link this impersonality of the subject to the concept of automatism and analyze this link through William K.L. Dickson’s "Kinetoscopic Record of a Sneeze" (1894) as an example of the many images of the time that depicted an involuntary movement on the part of the represented subject, namely an action or series of actions beyond the subject’s control.
I assert that this idea of displaying the ordinariness of an involuntary action constitutes a specificity on which both photographic and cinematographic technology are based.
(c) The comprehensive meaning of the represented subject therefore depends on the device; otherwise, it would be doomed to invisibility. In order to clarify what kind of invisibility is at stake here, my study takes a step back to examine the historical origins of the photo-cinematographic tools used in experimental physiology and the role that representation came to have (the idea of the autonomy of the representation).
Finally, in order to clarify these hypotheses, this essay analyzes the case of the French physiologist Étienne-Jules Marey.
By studying physiological theories on motion at the end of the nineteenth century, the article seeks to bring the relationship between photography and cinema back to its historical origins and highlight moments of intersection.Linda Bertellilinda.bertelli@imtlucca.it2016-11-22T12:23:40Z2016-11-22T12:23:40Zhttp://eprints.imtlucca.it/id/eprint/3599This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35992016-11-22T12:23:40ZNovel biodegradable nanocarriers for enhanced drug deliveryWith the refinement of their functional properties, interest in biodegradable materials for biorelated applications, and in particular in their use as controlled drug-delivery systems, has increased in recent decades. Biodegradable materials are an ideal platform to obtain nanoparticles for spatiotemporally controlled drug delivery in vivo, thanks to their biocompatibility, functionalizability, the control exerted over delivery rates, and their complete degradation. Their application in systems for cancer treatment, brain and cardiovascular diseases is already a consolidated practice in research, while the bench-to-bedside translation still lags behind. This review aims at summarizing reported applications of biodegradable materials to obtain drug-delivery nanoparticles in the last few years, giving a complete overview of the pros and cons of degradable nanomedicaments.Mariacristina Gagliardimariacristina.gagliardi@imtlucca.it2016-11-21T12:16:45Z2016-11-21T12:16:45Zhttp://eprints.imtlucca.it/id/eprint/3598This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35982016-11-21T12:16:45ZAssessing financial distress dependencies in OTC markets: a new approach using trade repositories dataIn this paper, we study the relationships among financial market sub-segments as a way to identify potential financial distress through increased co-movements among them.
To study how sub-markets are mutually co-dependent, we combine granular data on over-the-counter derivatives from trade repositories and the joint probability of distress (JPoD) approach introduced by the International Monetary Fund. We define an indicator that combines several distress drivers and observe that results on co-dependencies are similar to those that would be expected: similarities between financial and contractual terms seem to be responsible for stronger co-movements among sub-markets. However, high JPoD values even for quite dissimilar sub-markets suggest the presence of other drivers that should be investigated in future research. To the best of our knowledge, this is the first empirical study on systemic risk assessment based on micro-founded trade repositories’ data on interest rate swaps.Michele Bonollomichele.bonollo@imtlucca.itIrene Crimaldiirene.crimaldi@imtlucca.itAndrea Floriandrea.flori@imtlucca.itLaura Gianfagnalaura.gianfagna@imtlucca.itFabio Pammollif.pammolli@imtlucca.it2016-11-14T11:56:25Z2016-11-14T11:56:25Zhttp://eprints.imtlucca.it/id/eprint/3597This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35972016-11-14T11:56:25ZCentral limit theorems for a hypergeometric randomly reinforced urnWe consider a variant of the randomly reinforced urn in which several balls can be drawn out simultaneously and balls of different colors can be added simultaneously. More precisely, at each time-step, the conditional distribution of the number of extracted balls of a certain color given the past is assumed to be hypergeometric. We prove some central limit theorems in the sense of stable convergence and of almost sure conditional convergence, which are stronger than convergence in distribution. The proven results provide asymptotic confidence intervals for the limit proportion, whose distribution is generally unknown.
Moreover, we also consider the case of several urns subject to common random factors.Irene Crimaldiirene.crimaldi@imtlucca.it2016-11-14T11:24:12Z2016-11-14T11:24:12Zhttp://eprints.imtlucca.it/id/eprint/3595This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35952016-11-14T11:24:12ZWhole Image Synthesis Using a Deep Encoder-Decoder NetworkThe synthesis of medical images is an intensity transformation of a given modality in a way that represents an acquisition with a different modality (in the context of MRI this represents the synthesis of images originating from different MR sequences). Most methods follow a patch-based approach, which is computationally inefficient during synthesis and requires some sort of ‘fusion’ to synthesize a whole image from patch-level results. In this paper, we present a whole image synthesis approach that relies on deep neural networks. Our architecture resembles that of encoder-decoder networks and aims to synthesize a source MRI modality into another target MRI modality. The proposed method is computationally fast, does not require extensive amounts of memory, and produces results comparable to recent patch-based approaches.Vasileios SevetlidisMario Valerio Giuffridavalerio.giuffrida@imtlucca.itSotirios A. Tsaftaris2016-11-14T11:18:52Z2016-11-14T11:18:52Zhttp://eprints.imtlucca.it/id/eprint/3594This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35942016-11-14T11:18:52ZRotation-Invariant Restricted Boltzmann Machine Using Shared Gradient FiltersFinding suitable features has been an essential problem in computer vision. We focus on Restricted Boltzmann Machines (RBMs), which, despite their versatility, cannot accommodate transformations that may occur in the scene. As a result, several approaches have been proposed that consider a set of transformations, which are used to either augment the training set or transform the actual learned filters.
In this paper, we propose the Explicit Rotation-Invariant Restricted Boltzmann Machine, which exploits prior information coming from the dominant orientation of images. Our model extends the standard RBM by adding a suitable number of weight matrices, each associated with a dominant gradient orientation. We show that our approach is able to learn rotation-invariant features, comparing it with the classic formulation of RBM on the MNIST benchmark dataset. Overall, by requiring fewer hidden units, our method learns compact features that are robust to rotations.Mario Valerio Giuffridavalerio.giuffrida@imtlucca.itSotirios A. Tsaftaris2016-11-08T09:21:45Z2016-11-08T09:21:45Zhttp://eprints.imtlucca.it/id/eprint/3592This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35922016-11-08T09:21:45ZBridging or bonding? Preferences for redistribution and social capital in RussiaDoes bridging or bonding social capital matter for redistribution preferences? Existing literature demonstrates a causal link between measures of social capital and such preferences, but mostly for developed countries with good enforcement of formal rules and without a distinction between two completely different types of social capital. We argue that the welfare state relies on contributions from an immense number of anonymous citizens; thus attitudes towards strangers, i.e. generalized trust and solidarity, should be salient. Using two surveys of about 34,000 and 37,000 Russians, we prove this proposition, showing the importance of the bridging type but not the bonding one. Instrumenting social capital with education, climate, and distance from Moscow, we address endogeneity concerns.
Additionally, we argue that the connection between social capital and redistribution preferences in less developed countries such as Russia could be similar to that in developed countries.Ekaterina BorisovaAndrei GovorunDenis Ivanov2016-10-28T10:29:23Z2016-10-28T10:29:23Zhttp://eprints.imtlucca.it/id/eprint/3591This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35912016-10-28T10:29:23ZLa notion d’Ornement dans les documents d’archives et les biographies d’artistes (XVIeme-XVIIIeme siècles)Emanuele Pellegriniemanuele.pellegrini@imtlucca.it2016-10-24T12:26:12Z2016-10-24T12:26:12Zhttp://eprints.imtlucca.it/id/eprint/3586This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35862016-10-24T12:26:12ZMicro- and Macrostructured PLGA/Gelatin Scaffolds Promote Early Cardiogenic Commitment of Human Mesenchymal Stem Cells In VitroThe biomaterial scaffold plays a key role in most tissue engineering strategies. Its surface properties, micropatterning, degradation, and mechanical features affect not only the generation of the tissue construct in vitro, but also its in vivo functionality. The area of myocardial tissue engineering still faces significant difficulties and challenges in the design of bioactive scaffolds, which allow composition variation to accommodate divergence in the evolving myocardial structure. Here we aimed at verifying whether a microstructured bioartificial scaffold alone can elicit an effect on stem cell behavior. To this purpose, we fabricated microstructured bioartificial polymeric constructs made of PLGA/gelatin mimicking the anisotropic structure and mechanical properties of the myocardium. We found that PLGA/gelatin scaffolds promoted adhesion, elongation, ordered disposition, and early myocardial commitment of human mesenchymal stem cells, suggesting that these constructs are able to crosstalk with stem cells in a precise and controlled manner.
At the same time, the biomaterial degradation kinetics renders the PLGA/gelatin constructs very attractive for myocardial regeneration approaches.Caterina CristalliniElisa Cibrario RocchiettiMariacristina Gagliardimariacristina.gagliardi@imtlucca.itLeonardo MortatiSilvia SaviozziElena BellottiValentina TurinettoMaria Paola SassiNiccoletta BarbaniClaudia Giachino2016-10-10T15:59:21Z2016-10-10T15:59:21Zhttp://eprints.imtlucca.it/id/eprint/3585This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35852016-10-10T15:59:21ZDetecting early signs of the 2007–2008 crisis in the world tradeSince 2007, several contributions have tried to identify early-warning signals of the financial crisis. However, the vast majority of analyses has focused on financial systems and little theoretical work has been done on the economic counterpart. In the present paper we fill this gap and employ the theoretical tools of network theory to shed light on the response of world trade to the financial crisis of 2007 and the economic recession of 2008–2009. We have explored the evolution of the bipartite World Trade Web (WTW) across the years 1995–2010, monitoring the behavior of the system both before and after 2007. Our analysis shows early structural changes in the WTW topology: since 2003, the WTW becomes increasingly compatible with the picture of a network where correlations between countries and products are progressively lost. Moreover, the WTW structural modification can be considered as concluded in 2010, after a seemingly stationary phase of three years. 
We have also refined our analysis by considering specific subsets of countries and products: the most statistically significant early-warning signals are provided by the most volatile macrosectors, especially when measured on developing countries, suggesting that emerging economies are the most sensitive to global economic cycles.Fabio Saraccofabio.saracco@imtlucca.itRiccardo Di ClementeAndrea GabrielliTiziano Squartinitiziano.squartini@imtlucca.it2016-10-10T15:08:08Z2016-10-10T15:08:08Zhttp://eprints.imtlucca.it/id/eprint/3583This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35832016-10-10T15:08:08ZReal-time model predictive control based on dual gradient projection: Theory and fixed-point FPGA implementationThis paper proposes a method to design robust model predictive control (MPC) laws for discrete-time linear systems with hard mixed constraints on states and inputs, in case only an inexact solution of the associated quadratic program is available because of real-time requirements. By using a recently proposed dual gradient-projection algorithm, it is proved that the discrepancy between the optimal control law and the obtained one is bounded even if the solver is implemented in fixed-point arithmetic. By defining an alternative MPC problem with tightened constraints, a feasible solution is obtained for the original MPC problem, which guarantees recursive feasibility and asymptotic stability of the closed-loop system with respect to a set including the origin, also considering the presence of external disturbances.
The proposed MPC law is implemented on a field-programmable gate array in order to show the practical applicability of the method.Matteo RubagottiPanagiotis PatrinosAlberto GuiggianiAlberto Bemporadalberto.bemporad@imtlucca.it2016-10-07T07:53:34Z2017-07-18T09:47:17Zhttp://eprints.imtlucca.it/id/eprint/3582This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35822016-10-07T07:53:34ZFestivalization of fantasy culture and the growing phenomenon of comic-cons: Lucca Comics & GamesYesim Tonga Uriarteyesim.tonga@imtlucca.it2016-10-07T07:49:40Z2017-07-18T09:49:11Zhttp://eprints.imtlucca.it/id/eprint/3581This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35812016-10-07T07:49:40ZInvestigating Socio-Economic Impacts of Mega Events: The Case of Comic-ConsYesim Tonga Uriarteyesim.tonga@imtlucca.itTiziano Antognozzitiziano.antognozzi@imtlucca.itMaria Luisa Catonimarialuisa.catoni@imtlucca.it2016-10-06T15:07:07Z2016-10-06T15:07:07Zhttp://eprints.imtlucca.it/id/eprint/3576This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35762016-10-06T15:07:07ZDiffLQN: Differential Equation Analysis of Layered Queuing NetworksLayered queuing networks are a popular technique in software performance engineering. In this paper we present DiffLQN, a tool for the analysis of networks using ordinary differential equations. It estimates average performance indices such as throughput, utilization, and response time of software and hardware devices. The complexity of computing the solution is independent of the concurrency levels in the model (i.e., thread multiplicities and processing units) and the estimates are theoretically guaranteed to be asymptotically correct for large enough concurrency levels. 
DiffLQN is designed for compatibility with other tools that support state-of-the-art methods based on mean value analysis.Tabea WaizmannMirco Tribastonemirco.tribastone@imtlucca.it2016-10-06T14:45:00Z2016-10-06T14:45:00Zhttp://eprints.imtlucca.it/id/eprint/3574This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35742016-10-06T14:45:00ZSymbolic Performance AdaptationQuality-of-Service attributes such as performance and reliability heavily depend on the run-time conditions under which software is executed (e.g., workload fluctuation and resource availability). Therefore, it is important to design systems able to adapt their settings and behavior to these run-time variabilities. In this paper we propose a novel approach based on queuing networks as the quantitative model to represent system configurations. To find a model that fits continuous changes in run-time conditions, we rely on an innovative combination of symbolic analysis and satisfiability modulo theory (SMT). Through symbolic analysis, we represent all possible system configurations as a set of nonlinear real constraints. By formulating an SMT problem, we are able to devise feasible system configurations at a small computational cost. We study the effectiveness and scalability of our approach on a three-tier web system featuring different levels of redundancy.Emilio IncertoMirco Tribastonemirco.tribastone@imtlucca.itCatia Trubiani2016-10-06T14:39:06Z2016-10-06T14:39:06Zhttp://eprints.imtlucca.it/id/eprint/3573This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35732016-10-06T14:39:06ZWorkload Change Point Detection for Runtime Thermal Management of Embedded SystemsApplications executed on multicore embedded systems interact with system software [such as the operating system (OS)] and hardware, leading to widely varying thermal profiles which accelerate some aging mechanisms, reducing the lifetime reliability.
Effectively managing the temperature therefore requires: 1) autonomous detection of changes in application workload and 2) appropriate selection of control levers to manage thermal profiles of these workloads. In this paper, we propose a technique for workload change detection using density ratio-based statistical divergence between overlapping sliding windows of CPU performance statistics. This is integrated into a runtime approach for thermal management, which uses reinforcement learning to select workload-specific thermal control levers by sampling on-board thermal sensors. Identified control levers override the OS's native thread allocation decisions and scale hardware voltage-frequency to improve average temperature, peak temperature, and thermal cycling. The proposed approach is validated through its implementation as a hierarchical runtime manager for Linux, with heuristic-based thread affinity selected from the upper hierarchy to reduce thermal cycling and learning-based voltage-frequency selected from the lower hierarchy to reduce average and peak temperatures. Experiments conducted with mobile, embedded, and high-performance applications on ARM-based embedded systems demonstrate that the proposed approach increases workload change detection accuracy by an average of 3.4×, reducing the average temperature by 4 °C-25 °C, peak temperature by 6 °C-24 °C, and thermal cycling by 7%-35% over state-of-the-art approaches.Anup DasGeoff V. MerrettMirco Tribastonemirco.tribastone@imtlucca.itBashir M. Al-Hashimi2016-10-06T14:27:01Z2016-10-06T14:27:01Zhttp://eprints.imtlucca.it/id/eprint/3572This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35722016-10-06T14:27:01ZLearning from successes and failures in pharmaceutical R&DIn this paper, we build a cumulative innovation model to understand the role of both success and failure in the learning dynamics that characterize pharmaceutical R&D.
We test the predictions of our model by means of a unique dataset that combines patent information with R&D projects, thus distinguishing patents related to successfully marketed products from those covering candidate drugs that failed in clinical trials. Results confirm the model's predictions, showing that patents associated with successfully completed projects receive more citations than those associated with failed projects. However, we also show that patents from failed projects can in turn be cited more often than patents lacking clinical or preclinical information. We further explore the 'black box' of innovation, providing evidence that both successes and failures contribute to R&D investment decisions and knowledge dynamics in science-driven sectors.Jing-Yuan ChiouLaura MagazziniFabio Pammollif.pammolli@imtlucca.itMassimo Riccabonimassimo.riccaboni@imtlucca.it2016-10-06T09:30:02Z2016-10-06T09:30:02Zhttp://eprints.imtlucca.it/id/eprint/3563This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35632016-10-06T09:30:02ZOn the Power of Attribute-Based CommunicationIn open systems exhibiting adaptation, behaviors can arise as side effects of intensive component interaction. Finding ways to understand and design these systems is a difficult but important endeavor. To tackle these issues, we present AbC, a calculus for attribute-based communication. An AbC system consists of a set of parallel agents, each of which is equipped with a set of attributes. Communication takes place in an implicit multicast fashion, and interactions among agents are dynamically established by taking into account “connections” as determined by predicates over the attributes of agents.
First, the syntax and the semantics of the calculus are presented; then the expressiveness and effectiveness of AbC are demonstrated, both in terms of modeling scenarios featuring collaboration, reconfiguration, and adaptation, and in terms of the possibility of encoding channel-based interactions and other interaction patterns. Behavioral equivalences for AbC are introduced for establishing formal relationships between different descriptions of the same system.Yehia Abd AlrahmanRocco De Nicolar.denicola@imtlucca.itMichele Loreti2016-10-06T09:24:59Z2016-10-06T09:24:59Zhttp://eprints.imtlucca.it/id/eprint/3562This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35622016-10-06T09:24:59ZDynamic SLAs for CloudsIn the Cloud domain, to guarantee adaptation to the needs of users and providers, Service-Level Agreements (SLAs) would benefit from mechanisms to capture the dynamism of services. The existing SLA languages attempt to address this challenge by focusing on renegotiation of the agreement terms, which is a heavy-weight process, not really suitable for dealing with cloud dynamism. In this paper, we propose an extension of SLAC, an SLA language for clouds that we have recently defined, with a mechanism that enables dynamic modifications of the service agreement. We formally describe this extension, implement it in the SLAC framework, and analyse the impacts of dynamic SLAs in some applications.
The advantages of dynamic SLAs are demonstrated by comparing their effect with those of static SLAs and of the “renegotiation” approach.Rafael Brundo UriarteFrancesco TiezziRocco De Nicolar.denicola@imtlucca.it2016-10-06T09:14:53Z2016-10-06T09:16:01Zhttp://eprints.imtlucca.it/id/eprint/3560This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35602016-10-06T09:14:53ZMultiparty Testing PreordersVariants of the must testing approach have been successfully applied in Service Oriented Computing for analysing the compliance between (contracts exposed by) clients and servers or, more generally, between two peers. It has however been argued that multiparty scenarios call for more permissive notions of compliance because partners usually do not have full coordination capabilities. We propose two new testing preorders, which are obtained by restricting the set of potential observers. For the first preorder, called uncoordinated, we allow only sets of parallel observers that use different parts of the interface of a given service and have no possibility of intercommunication. For the second preorder, which we call independent, we instead rely on parallel observers that perceive as silent all the actions that are not in the interface of interest. We show that the uncoordinated preorder is coarser than the classical must testing preorder and finer than the independent one.
We also provide a characterisation in terms of decorated traces for both preorders: the uncoordinated preorder is defined in terms of must-sets and Mazurkiewicz traces while the independent one is described in terms of must-sets and classes of filtered traces that only contain designated visible actions.Rocco De Nicolar.denicola@imtlucca.itHernán Melgratti2016-10-05T13:17:08Z2017-03-27T11:08:21Zhttp://eprints.imtlucca.it/id/eprint/3559This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35592016-10-05T13:17:08ZHand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands.The term 'synergy' - from the Greek synergia - means 'working together'. The concept of multiple elements working together towards a common goal has been extensively used in neuroscience to develop theoretical frameworks, experimental approaches, and analytical techniques to understand neural control of movement, and for applications for neuro-rehabilitation. In the past decade, roboticists have successfully applied the framework of synergies to create novel design and control concepts for artificial hands, i.e., robotic hands and prostheses. At the same time, robotic research on the sensorimotor integration underlying the control and sensing of artificial hands has inspired new research approaches in neuroscience, and has provided useful instruments for novel experiments. The ambitious goal of integrating expertise and research approaches in robotics and neuroscience to study the properties and applications of the concept of synergies is generating a number of multidisciplinary cooperative projects, among which the recently finished 4-year European project "The Hand Embodied" (THE). This paper reviews the main insights provided by this framework. 
Specifically, we provide an overview of neuroscientific bases of hand synergies and introduce how robotics has leveraged the insights from neuroscience for innovative design in hardware and controllers for biomedical engineering applications, including myoelectric hand prostheses, devices for haptics research, and wearable sensing of human hand kinematics. The review also emphasizes how this multidisciplinary collaboration has generated new ways to conceptualize a synergy-based approach for robotics, and provides guidelines and principles for analyzing human behavior and synthesizing artificial robotic systems based on a theory of synergies.Marco SantelloMatteo BianchiMarco GabicciniEmiliano Ricciardiemiliano.ricciardi@imtlucca.itGionata SalviettiDomenico PrattichizzoMarc ErnstAlessandro MoscatelliHenrik JörntellAstrid M L KappersKostas KyriakopoulosAlin Albu-SchäfferClaudio CastelliniAntonio Bicchi2016-10-05T13:15:44Z2017-06-21T13:18:09Zhttp://eprints.imtlucca.it/id/eprint/3558This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35582016-10-05T13:15:44ZWhen Neuroscience 'Touches' Architecture: From Hapticity to a Supramodal Functioning of the Human Brain.In the last decades, the rapid growth of functional brain imaging methodologies allowed cognitive neuroscience to address open questions in philosophy and social sciences. At the same time, novel insights from cognitive neuroscience research have begun to influence various disciplines, leading to a turn to cognition and emotion in the fields of planning and architectural design. Since 2003, the Academy of Neuroscience for Architecture has been supporting 'neuro-architecture' as a way to connect neuroscience and the study of behavioral responses to the built environment. 
Among the many topics related to multisensory perceptual integration and embodiment, the concept of hapticity was recently introduced, suggesting a pivotal role of tactile perception and haptic imagery in architectural appraisal. Arguments have thus arisen in favor of the existence of shared cognitive foundations between hapticity and the supramodal functional architecture of the human brain. Specifically, supramodality refers to the functional feature of defined brain regions to process and represent specific information content in a more abstract way, independently of the sensory modality conveying such information to the brain. Here, we highlight some commonalities and differences between the concepts of hapticity and supramodality according to the distinctive perspectives of architecture and cognitive neuroscience. This comparison and connection between these two different approaches may lead to novel observations with regard to people-environment relationships, and even provide empirical foundations for a renewed evidence-based design theory.Paolo PapaleLeonardo ChiesiAlessandra Cecilia Rampininialessandra.rampinini@imtlucca.itPietro Pietrinipietro.pietrini@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2016-10-05T13:11:35Z2017-03-27T11:07:09Zhttp://eprints.imtlucca.it/id/eprint/3557This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35572016-10-05T13:11:35ZTowards a synergy framework across neuroscience and robotics: Lessons learned and open questions.
Reply to comments on: "Hand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands".Marco SantelloMatteo BianchiMarco GabicciniEmiliano Ricciardiemiliano.ricciardi@imtlucca.itGionata SalviettiDomenico PrattichizzoMarc ErnstAlessandro MoscatelliHenrik JorntellAstrid M L KappersKostas KyriakopoulosAlin Abu SchaefferClaudio CastelliniAntonio Bicchi2016-10-04T12:24:38Z2016-10-04T12:24:38Zhttp://eprints.imtlucca.it/id/eprint/3555This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35552016-10-04T12:24:38ZCoupling News Sentiment with Web Browsing Data Improves Prediction of Intra-Day Price DynamicsThe new digital revolution of big data is deeply changing our capability of understanding society and forecasting the outcome of many social and economic systems. Unfortunately, information can be very heterogeneous in the importance, relevance, and surprise it conveys, severely affecting the predictive power of semantic and statistical methods. Here we show that the aggregation of web users’ behavior can be elicited to overcome this problem in a hard-to-predict complex system, namely the financial market. Specifically, our in-sample analysis shows that the combined use of sentiment analysis of news and browsing activity of users of Yahoo! Finance greatly helps forecasting intra-day and daily price changes of a set of 100 highly capitalized US stocks traded in the period 2012–2013. Sentiment analysis or browsing activity, when taken alone, has very small or no predictive power. Conversely, when considering a news signal where in a given time interval we compute the average sentiment of the clicked news, weighted by the number of clicks, we show that for nearly 50% of the companies such a signal Granger-causes hourly price returns.
Our result indicates a “wisdom-of-the-crowd” effect that allows us to exploit users’ activity to identify and properly weigh the relevant and surprising news, considerably enhancing the forecasting power of the news sentiment.Wei-Xing ZhouGabriele Rancogabriele.ranco@imtlucca.itIlaria BordinoGiacomo BormettiGuido Caldarelliguido.caldarelli@imtlucca.itFabrizio LilloMichele Treccani2016-10-04T11:19:22Z2016-10-04T11:19:22Zhttp://eprints.imtlucca.it/id/eprint/3554This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35542016-10-04T11:19:22ZUsers Polarization on Facebook and YoutubeOn social media, algorithms for content promotion that account for users’ preferences might limit exposure to unsolicited content. In this work, we study how the same contents (videos) are consumed on different platforms -- i.e. Facebook and YouTube -- over a sample of 12M users. Our findings show that the same content leads to the formation of echo chambers, irrespective of the online social network and thus of the algorithm for content promotion. Finally, we show that the users' commenting patterns are accurate early predictors for the formation of echo chambers.Alessandro BessiFabiana Zollofabiana.zollo@imtlucca.itMichela Del Vicariomichela.delvicario@imtlucca.itMichelangelo Puligamichelangelo.puliga@imtlucca.itAntonio ScalaGuido Caldarelliguido.caldarelli@imtlucca.itBrian UzziWalter Quattrociocchiwalter.quattrociocchi@imtlucca.it2016-10-04T11:13:27Z2016-10-04T11:13:27Zhttp://eprints.imtlucca.it/id/eprint/3553This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35532016-10-04T11:13:27ZHierarchical organization of functional connectivity in the mouse brain: a complex network approachThis paper represents a contribution to the study of brain functional connectivity from the perspective of complex network theory.
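The news-sentiment item above rests on Granger-causality between a click-weighted sentiment signal and price returns. A minimal, self-contained sketch of the underlying idea (restricted vs. unrestricted lagged regression; the single-lag setup and variable names are our own simplification, not the paper's actual methodology):

```python
import numpy as np

def granger_gain(y, x):
    """Fraction of residual variance of y explained by adding one lag of x
    to an AR(1) model of y -- a crude proxy for a Granger-causality check."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(ylag)

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, yt, rcond=None)
        resid = yt - X @ beta
        return resid @ resid

    rss_restricted = rss(np.column_stack([ones, ylag]))          # own lags only
    rss_unrestricted = rss(np.column_stack([ones, ylag, xlag]))  # + sentiment lag
    return 1.0 - rss_unrestricted / rss_restricted
```

In the paper's setting, y would be hourly returns and x the click-weighted average news sentiment; a formal test would use an F-statistic over several lags rather than this variance-reduction ratio.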
More specifically, we apply graph theoretical analyses to provide evidence of the modular structure of the mouse brain and to shed light on its hierarchical organization. We propose a novel percolation analysis and we apply our approach to the analysis of a resting-state functional MRI data set from 41 mice. This approach reveals a robust hierarchical structure of modules persistent across different subjects. Importantly, we test this approach against a statistical benchmark (or null model) which constrains only the distributions of empirical correlations. Our results unambiguously show that the hierarchical character of the mouse brain modular structure is not trivially encoded into this lower-order constraint. Finally, we investigate the modular structure of the mouse brain by computing the Minimal Spanning Forest, a technique that identifies subnetworks characterized by the strongest internal correlations. This approach represents a faster alternative to other community detection methods and provides a means to rank modules on the basis of the strength of their internal edges.Giampiero BardellaAngelo BifoneAndrea GabrielliAlessandro GozziTiziano Squartinitiziano.squartini@imtlucca.it2016-10-04T10:57:25Z2016-10-04T11:02:28Zhttp://eprints.imtlucca.it/id/eprint/3552This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35522016-10-04T10:57:25ZThe price of complexity in financial networksFinancial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. 
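The Minimal Spanning Forest mentioned in the mouse-brain item above keeps, within each component, the tree of strongest internal correlations. A minimal Kruskal-style sketch with union-find (our own illustrative implementation, not the authors' code):

```python
def max_spanning_forest(n_nodes, edges):
    """Kruskal's algorithm: scan edges (i, j, weight) from strongest to
    weakest and keep those that do not close a cycle."""
    parent = list(range(n_nodes))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    forest = []
    for i, j, w in sorted(edges, key=lambda e: -e[2]):
        ri, rj = find(i), find(j)
        if ri != rj:          # joins two components, so no cycle is formed
            parent[ri] = rj
            forest.append((i, j, w))
    return forest
```

Applied to a correlation matrix between brain regions (one weighted edge per pair), the surviving edges provide a basis for ranking modules by the strength of their internal links, as the abstract describes.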
From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.Stefano BattistonGuido Caldarelliguido.caldarelli@imtlucca.itRobert M. MayTarik RouknyJoseph E. Stiglitz2016-10-04T10:46:24Z2016-10-04T10:46:24Zhttp://eprints.imtlucca.it/id/eprint/3551This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35512016-10-04T10:46:24ZStatistically validated network of portfolio overlaps and systemic riskCommon asset holding by financial institutions, namely portfolio overlap, is nowadays regarded as an important channel for financial contagion with the potential to trigger fire sales and thus severe losses at the systemic level. In this paper we propose a method to assess the statistical significance of the overlap between pairs of heterogeneously diversified portfolios, which then allows us to build a validated network of financial institutions where links indicate potential contagion channels due to realized portfolio overlaps. The method is implemented on a historical database of institutional holdings ranging from 1999 to the end of 2013, but can be in general applied to any bipartite network where the presence of similar sets of neighbors is of interest. We find that the proportion of validated network links (i.e., of statistically significant overlaps) increased steadily before the 2007-2008 global financial crisis and reached a maximum when the crisis occurred. We argue that the nature of this measure implies that systemic risk from fire sales liquidation was maximal at that time. After a sharp drop in 2008, systemic risk resumed its growth in 2009, with a notable acceleration in 2013, reaching levels not seen since 2007. 
We finally show that market trends tend to be amplified in the portfolios identified by the algorithm, such that it is possible to obtain an informative signal about the financial institutions that are about to suffer (enjoy) the most significant losses (gains).Stanislao GualdiGiulio Ciminigiulio.cimini@imtlucca.itKevin PrimicerioRiccardo Di ClementeDamien Challet2016-10-04T10:38:55Z2016-10-04T10:38:55Zhttp://eprints.imtlucca.it/id/eprint/3550This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35502016-10-04T10:38:55ZPathways towards instability in financial networksThere is growing consensus that processes of market integration and risk diversification may come at the price of more systemic risk. Indeed, financial institutions are interconnected in a network of contracts where distress can either be amplified or dampened. However, a mathematical understanding of instability in relation to the network topology is still lacking. In a model financial network, we show that the origin of instability resides in the presence of specific types of cyclical structures, regardless of many of the details of the distress propagation mechanism. In particular, we show the existence of trajectories in the space of graphs along which a complex network turns from stable to unstable, although at each point along the trajectory its nodes satisfy constraints that would apparently make them individually stable. In the financial context, our findings have important implications for policies aimed at increasing financial stability. We illustrate the propositions on a sample dataset for the top 50 EU listed banks between 2008 and 2013.
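The portfolio-overlap item above asks whether the overlap between two portfolios is larger than expected by chance. Under the simplest null model (portfolios drawn uniformly at random from the asset universe) the overlap is hypergeometric; a sketch of this validation idea, not the paper's exact statistical procedure:

```python
from math import comb

def overlap_pvalue(k, universe, n1, n2):
    """P(overlap >= k) between two random portfolios of sizes n1 and n2
    drawn without replacement from `universe` assets (hypergeometric tail)."""
    upper = min(n1, n2)
    total = comb(universe, n2)
    tail = sum(comb(n1, i) * comb(universe - n1, n2 - i)
               for i in range(k, upper + 1))
    return tail / total
```

A link between two institutions would then be validated when this p-value falls below a (multiple-testing-corrected) significance threshold.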
More generally, our results shed light on previous findings on the instability of model ecosystems and are relevant for a broad class of dynamical processes on complex networks.Marco BardosciaStefano BattistonFabio CaccioliGuido Caldarelliguido.caldarelli@imtlucca.it2016-10-04T10:25:49Z2016-10-04T10:25:49Zhttp://eprints.imtlucca.it/id/eprint/3549This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35492016-10-04T10:25:49ZCascades in interdependent flow networksIn this manuscript, we investigate the abrupt breakdown behavior of coupled distribution grids under load growth. This scenario mimics the ever-increasing customer demand and the foreseen introduction of energy hubs interconnecting the different energy vectors. We extend an analytical model of cascading behavior due to line overloads to the case of interdependent networks and find evidence of first-order transitions due to the long-range nature of the flows. Our results indicate that the foreseen increase in the couplings between the grids has two competing effects: on the one hand, it increases the safety region where grids can operate without suffering systemic failures; on the other hand, it increases the possibility of a joint systems’ failure.Antonio ScalaPier Giorgio De Sanctis LucentiniGuido Caldarelliguido.caldarelli@imtlucca.itGregorio D’Agostino2016-10-04T09:40:34Z2016-10-04T09:40:34Zhttp://eprints.imtlucca.it/id/eprint/3547This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35472016-10-04T09:40:34ZA hierarchical consensus method for the approximation of the consensus state, based on clustering and spectral graph theoryA hierarchical method for the approximate computation of the consensus state of a network of agents is investigated. The method is motivated theoretically by spectral graph theory arguments.
In the first phase, the graph is divided into a number of subgraphs with good spectral properties, i.e., a fast convergence toward the local consensus state of each subgraph. To find the subgraphs, suitable clustering methods are used. Then, an auxiliary graph is considered to determine the final approximation of the consensus state in the original network. A theoretical investigation is performed of cases for which the hierarchical consensus method has a better performance guarantee than the non-hierarchical one (i.e., it requires a smaller number of iterations to guarantee a desired accuracy in the approximation of the consensus state of the original network). Moreover, numerical results demonstrate the effectiveness of the hierarchical consensus method for several case studies modeling real-world networks.Rita MorisiGiorgio Gneccogiorgio.gnecco@imtlucca.itAlberto Bemporadalberto.bemporad@imtlucca.it2016-10-04T08:56:19Z2016-10-04T08:56:19Zhttp://eprints.imtlucca.it/id/eprint/3545This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35452016-10-04T08:56:19ZPiecewise affine regression via recursive multiple least squares and multicategory discriminationIn nonlinear regression, choosing an adequate model structure is often a challenging problem. While simple models (such as linear functions) may not be able to capture the underlying relationship among the variables, over-parametrized models described by a large set of nonlinear basis functions tend to overfit the training data, leading to poor generalization on unseen data. Piecewise-affine (PWA) models can describe nonlinear and possibly discontinuous relationships while maintaining simple local affine regressor-to-output mappings, with extreme flexibility when the polyhedral partitioning of the regressor space is learned from data rather than fixed a priori.
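The hierarchical consensus item above builds on the convergence rate of the basic consensus iteration, which is governed by the spectral gap of the averaging matrix. A minimal numpy sketch of the plain (non-hierarchical) iteration, with our own example weights (Metropolis weights on a 3-node path graph), not the paper's algorithm:

```python
import numpy as np

def consensus(x0, W, tol=1e-9, max_iter=100000):
    """Iterate x <- W x until successive iterates agree to `tol`; for a
    doubly stochastic W on a connected graph this approaches the average
    of x0, at a rate set by the second-largest eigenvalue modulus of W."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_new = W @ x
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

# Metropolis weights on the path graph 0 - 1 - 2 (doubly stochastic)
W_path = np.array([[2/3, 1/3, 0.0],
                   [1/3, 1/3, 1/3],
                   [0.0, 1/3, 2/3]])
```

The hierarchical method then exploits subgraphs whose averaging matrices have a larger spectral gap, so each local consensus converges in fewer iterations than a global one.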
In this paper, we propose a novel and numerically very efficient two-stage approach for PWA regression based on a combined use of (i) recursive multi-model least-squares techniques for clustering and fitting linear functions to data, and (ii) linear multi-category discrimination, either offline (batch) via a Newton-like algorithm for computing a solution of unconstrained optimization problems with objective functions having a piecewise smooth gradient, or online (recursive) via averaged stochastic gradient descent.Valentina BreschiDario Pigadario.piga@imtlucca.itAlberto Bemporadalberto.bemporad@imtlucca.it2016-10-04T08:36:36Z2016-10-04T08:36:36Zhttp://eprints.imtlucca.it/id/eprint/3543This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35432016-10-04T08:36:36ZFrom linear to nonlinear MPC: bridging the gap via the real-time iterationLinear model predictive control (MPC) can currently be deployed at outstanding speeds, thanks to recent progress in algorithms for solving online the underlying structured quadratic programs. In contrast, nonlinear MPC (NMPC) requires the deployment of more elaborate algorithms, which require longer computation times than linear MPC. Nonetheless, computational speeds for NMPC comparable to those of MPC are now regularly reported, provided that adequate algorithms are used. In this paper, we aim at clarifying the similarities and differences between linear MPC and NMPC. In particular, we focus our analysis on NMPC based on the real-time iteration (RTI) scheme, as this technique has been successfully tested and, in some applications, requires computational times that are only marginally larger than linear MPC.
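The first stage of the PWA-regression item above relies on recursive least-squares updates to fit local affine models while clustering the data. A single-model RLS update, the elementary building block (illustrative sketch only; the paper's algorithm runs several such estimators in parallel and reassigns data points among them):

```python
import numpy as np

class RecursiveLeastSquares:
    """Standard recursive least squares: refine the parameter estimate
    theta one (regressor, output) pair at a time."""
    def __init__(self, dim, delta=100.0):
        self.theta = np.zeros(dim)       # current parameter estimate
        self.P = delta * np.eye(dim)     # covariance-like matrix

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        gain = Pphi / (1.0 + phi @ Pphi)             # Kalman-style gain
        self.theta = self.theta + gain * (y - phi @ self.theta)
        self.P = self.P - np.outer(gain, Pphi)       # rank-one downdate
        return self.theta
```

Each new sample costs O(dim^2), which is what makes the multi-model clustering-and-fitting stage cheap enough to run recursively over large datasets.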
The goal of the paper is to promote the understanding of RTI-based NMPC within the linear MPC community.Sébastien GrosMario ZanonRien QuirynenAlberto Bemporadalberto.bemporad@imtlucca.itMoritz Diehl2016-09-22T16:17:07Z2016-09-22T16:17:07Zhttp://eprints.imtlucca.it/id/eprint/3542This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35422016-09-22T16:17:07ZData Science and Complex Networks. Real Case Studies with PythonThis book provides a comprehensive yet short description of the basic concepts of Complex Network theory. In contrast to other books the authors present these concepts through real case studies. The application topics span from Foodwebs, to the Internet, the World Wide Web and the Social Networks, passing through the International Trade Web and Financial time series. The final part is devoted to definition and implementation of the most important network models.
The text provides information on the structure of the data and on the quality of available datasets. Furthermore it provides a series of codes to allow immediate implementation of what is theoretically described in the book. Readers already used to the concepts introduced in this book can learn the art of coding in Python by using the online material. To this purpose the authors have set up a dedicated web site where readers can download and test the codes. The whole project is aimed as a learning tool for scientists and practitioners, enabling them to begin working instantly in the field of Complex Networks.Guido Caldarelliguido.caldarelli@imtlucca.itAlessandro Chessaalessandro.chessa@imtlucca.it2016-09-16T08:51:39Z2016-09-16T10:26:17Zhttp://eprints.imtlucca.it/id/eprint/3541This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35412016-09-16T08:51:39ZFinely-grained annotated datasets for image-based plant phenotypingImage-based approaches to plant phenotyping are gaining momentum providing fertile ground for several interesting vision tasks where fine-grained categorization is necessary, such as leaf segmentation among a variety of cultivars, and cultivar (or mutant) identification. However, benchmark data focusing on typical imaging situations and vision tasks are still lacking, making it difficult to compare existing methodologies. This paper describes a collection of benchmark datasets of raw and annotated top-view color images of rosette plants. We briefly describe plant material, imaging setup and procedures for different experiments: one with various cultivars of Arabidopsis and one with tobacco undergoing different treatments. We proceed to define a set of computer vision and classification tasks and provide accompanying datasets and annotations based on our raw data. We describe the annotation process performed by experts and discuss appropriate evaluation criteria. 
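In the spirit of the book's hands-on examples, a minimal pure-Python computation of a network's degree sequence and degree histogram (our own sketch, not code from the book's companion website):

```python
from collections import Counter

def degree_distribution(edges):
    """Node degrees and degree histogram of an undirected edge list."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    histogram = Counter(degree.values())   # degree -> number of nodes
    return dict(degree), dict(histogram)
```

For a star graph with centre 0 and four leaves, `degree_distribution([(0, i) for i in range(1, 5)])` reports the centre with degree 4 and four nodes of degree 1.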
We also offer exemplary use cases and results on some tasks obtained with parts of these data. We hope with the release of this rigorous dataset collection to invigorate the development of algorithms in the context of plant phenotyping but also provide new interesting datasets for the general computer vision community to experiment on. Data are publicly available at http://www.plant-phenotyping.org/datasets.Massimo Minervinimassimo.minervini@imtlucca.itAndreas FischbachHanno ScharrSotirios A. Tsaftaris2016-09-12T11:12:22Z2016-09-12T11:12:22Zhttp://eprints.imtlucca.it/id/eprint/3531This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35312016-09-12T11:12:22ZAnalysis of residual dependencies of independent components extracted from fMRI dataIndependent component analysis (ICA) of functional magnetic resonance imaging (fMRI) data can be employed as an exploratory method. The lack in the ICA model of strong a priori assumptions about the signal or about the noise leads to difficult interpretations of the results. Moreover, the statistical independence of the components is only approximated. Residual dependencies among the components can reveal informative structure in the data. A major problem is related to model order selection, that is, the number of components to be extracted. Specifically, overestimation may lead to component splitting. In this work, a method based on hierarchical clustering of ICA applied to fMRI datasets is investigated. The clustering algorithm uses a metric based on the mutual information between the ICs. To estimate the similarity measure, a histogram-based technique and one based on kernel density estimation are tested on simulated datasets. Simulations results indicate that the method could be used to cluster components related to the same task and resulting from a splitting process occurring at different model orders. Different performances of the similarity measures were found and discussed. 
Preliminary results on real data are reported and show that the method can group task related and transiently task related components.Nicola VanelloEmiliano Ricciardiemiliano.ricciardi@imtlucca.itLuigi Landini2016-09-12T11:00:16Z2017-08-04T10:17:42Zhttp://eprints.imtlucca.it/id/eprint/3530This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35302016-09-12T11:00:16ZCongenital blindness affects diencephalic but not mesencephalic structures in the human brainWhile there is ample evidence that the structure and function of visual cortical areas are affected by early visual deprivation, little is known of how early blindness modifies subcortical relay and association thalamic nuclei, as well as mesencephalic structures. Therefore, in the present multicenter study, we used MRI to measure volume of the superior and inferior colliculi, as well as of the thalamic nuclei relaying sensory and motor information to the neocortex, parcellated according to atlas-based thalamo-cortical connections, in 29 individuals with congenital blindness of peripheral origin (17 M, age 35.7 ± 14.3 years) and 29 sighted subjects (17 M, age 31.9 ± 9.0). Blind participants showed an overall volume reduction in the left (p = 0.008) and right (p = 0.007) thalami, as compared to the sighted individuals. Specifically, the lateral geniculate (i.e., primary visual thalamic relay nucleus) was 40 % reduced (left: p = 4 × 10−6, right: p < 1 × 10−6), consistent with findings from animal studies. In addition, associated thalamic nuclei that project to temporal (left: p = 0.005, right: p = 0.005), prefrontal (left: p = 0.010, right: p = 0.014), occipital (left: p = 0.005, right: p = 0.023), and right premotor (p = 0.024) cortical regions were also significantly reduced in the congenitally blind group. Conversely, volumes of the relay nuclei directly involved in auditory, motor, and somatosensory processing were not affected by visual deprivation. 
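The similarity metric in the ICA item above is the mutual information between components, estimated from histograms. A minimal numpy version of the histogram-based estimator (the kernel-density variant is omitted; the bin count and variable names are our own choices):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X; Y): KL divergence between the joint
    bin frequencies and the product of the marginal bin frequencies."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    indep = np.outer(px, py)
    nz = pxy > 0                      # skip empty bins (0 log 0 = 0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / indep[nz])))
```

Pairwise values of this measure between independent components would then feed the hierarchical clustering step, grouping components that likely arose from splitting.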
In contrast, no difference in volume was observed in either the superior or the inferior colliculus between the two groups. Our findings indicate that visual loss since birth leads to selective volumetric changes within diencephalic, but not mesencephalic, structures. Both changes in reciprocal cortico-thalamic connections or modifications in the intrinsic connectivity between relay and association nuclei of the thalamus may contribute to explain these alterations in thalamic volumes. Sparing of the superior colliculi is in line with their composite, multisensory projections, and with their not exclusive visual nature.Luca Cecchettiluca.cecchetti@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.itGiacomo HandjarasRon KupersMaurice PtitoPietro Pietrinipietro.pietrini@imtlucca.it2016-09-08T06:40:19Z2016-09-08T06:40:19Zhttp://eprints.imtlucca.it/id/eprint/3526This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35262016-09-08T06:40:19ZStatic VS Dynamic Reversibility in CCSThe notion of reversible computing is attracting interest because of its applications in diverse fields, in particular the study of programming abstractions for fault tolerant systems. Reversible CCS (RCCS), proposed by Danos and Krivine, enacts reversibility by means of memory stacks. Ulidowski and Phillips proposed a general method to reverse a process calculus given in a particular SOS format, by exploiting the idea of making all the operators of a calculus static. CCSK is then derived from CCS with this method. 
In this paper we show that RCCS is at least as expressive as CCSK.Doriana MedicClaudio Antares Mezzinaclaudio.mezzina@imtlucca.it2016-09-05T08:12:54Z2016-09-05T08:12:54Zhttp://eprints.imtlucca.it/id/eprint/3525This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35252016-09-05T08:12:54ZLa conservazione nelle piccole e medie imprese di Genova: un percorso di censimentoGemma Torregemma.torre@imtlucca.it2016-09-05T08:10:47Z2016-09-05T08:10:47Zhttp://eprints.imtlucca.it/id/eprint/3524This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35242016-09-05T08:10:47Z(a cura di) Un archivio per l’impresa : problemi e prospettive di conservazioneGemma Torregemma.torre@imtlucca.it2016-08-31T08:47:26Z2016-08-31T08:47:26Zhttp://eprints.imtlucca.it/id/eprint/3523This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35232016-08-31T08:47:26ZPolarized User and Topic Tracking in TwitterDigital traces of conversations in micro-blogging platforms and OSNs provide information about user opinion with a high degree of resolution. These information sources can be exploited to understand and monitor collective behaviours. In this work, we focus on polarisation classes, i.e., those topics that require the user to side exclusively with one position. The proposed method provides an iterative classification of users and keywords: first, polarised users are identified, then polarised keywords are discovered by monitoring the activities of previously classified users. This method thus allows tracking users and topics over time. We report several experiments conducted on two Twitter datasets during political election time-frames. 
We measure the user classification accuracy on a golden set of users, and analyse the relevance of the extracted keywords for the ongoing political discussion.Mauro Colettomauro.coletto@imtlucca.itClaudio LuccheseSalvatore OrlandoRaffaele Perego2016-07-19T09:42:52Z2016-07-20T07:36:58Zhttp://eprints.imtlucca.it/id/eprint/3519This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35192016-07-19T09:42:52ZTweet-tales: moods of socio-economic crisis?The widespread adoption of highly interactive social media like Twitter, Facebook and other platforms allows users to communicate moods and opinions to their social network. Those platforms represent an unprecedented source of information about human habits and socio-economic interactions. Several new studies have started to exploit the potential of these big data as fingerprints of economic and social interactions.
The present analysis explores the informative power of indicators derived from social media activity, with the aim of tracing some preliminary guidelines for investigating the possible correspondence between social media indices and available labour market indicators at a territorial level. The study is based on a large dataset of about 4 million Italian-language tweets collected from October 2014 to December 2015, filtered by a set of specific keywords related to the labour market. With techniques from machine learning and users’ geolocation, we were able to subset the tweets on specific topics in all Italian provinces. The corpus of tweets is then analyzed with linguistic tools and hierarchical clustering analysis. A comparison with traditional economic indicators suggests a strong need for further cleaning procedures, which are then developed in detail. As data from social networks are easy to obtain, this represents a first attempt to evaluate their informative power in the Italian context, which is of potentially high importance in economic and social research.Grazia BiorciAntonella EminaMichelangelo Puligamichelangelo.puliga@imtlucca.itLisa SellaGianna Vivaldogianna.vivaldo@imtlucca.it2016-07-19T08:36:41Z2016-07-19T08:36:41Zhttp://eprints.imtlucca.it/id/eprint/3518This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35182016-07-19T08:36:41ZA continuous-time stochastic model for the mortality surface of multiple populationsWe formulate, study and calibrate a continuous-time model for
the joint evolution of the mortality surface of multiple populations. We model the mortality intensity by age and population as a mixture of stochastic latent factors that can be either population-specific or common to all populations. These factors are described by affine time-(in)homogeneous stochastic processes. Traditional, deterministic mortality laws can be extended to multi-population stochastic counterparts within our framework. We detail the calibration procedure when the factors are Gaussian, using a centralized data-fusion Kalman filter. We provide an application based on the mortality of UK males and females. Although parsimonious, the specification we calibrate provides a good fit of the observed mortality surface (ages 0-99) of both sexes between 1960 and 2013.Petar JevtićLuca Regisluca.regis@imtlucca.it2016-07-13T09:43:28Z2016-07-13T09:43:28Zhttp://eprints.imtlucca.it/id/eprint/3516This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35162016-07-13T09:43:28ZQuantitative Abstractions for Collective Adaptive SystemsCollective adaptive systems (CAS) consist of a large number of possibly heterogeneous entities evolving according to local interactions that may operate across multiple scales in time and space. The adaptation to changes in the environment, as well as the highly dispersed decision-making process, often leads to emergent behaviour that cannot be understood by simply analysing the objectives, properties, and dynamics of the individual entities in isolation.
As with most complex systems, modelling is a phase of crucial importance for the design of new CAS or the understanding of existing ones. Elsewhere in this volume the typical workflow of formal modelling, analysis, and evaluation of a CAS has been illustrated in detail. In this chapter we treat the problem of efficiently analysing large-scale CAS for quantitative properties. We review algorithms to automatically reduce the dimensionality of a CAS model while preserving modeller-defined state variables, with a focus on descriptions based on systems of ordinary differential equations. We illustrate the theory in a tutorial fashion, with running examples and a number of more substantial case studies spanning crowd dynamics, epidemiology, and biological systems.Andrea Vandinandrea.vandin@imtlucca.itMirco Tribastonemirco.tribastone@imtlucca.it2016-07-04T07:57:39Z2017-07-18T09:47:45Zhttp://eprints.imtlucca.it/id/eprint/3510This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35102016-07-04T07:57:39ZHow to build an identity for a cultural institution that inhabits the contemporaneity: SALTYesim Tonga Uriarteyesim.tonga@imtlucca.it2016-06-28T13:34:12Z2016-06-28T13:34:12Zhttp://eprints.imtlucca.it/id/eprint/3508This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35082016-06-28T13:34:12ZHow to estimate epidemic risk from incomplete contact
diaries data?Social interactions shape the patterns of spreading processes in a population. Techniques such as diaries or proximity sensors make it possible to collect data about encounters and to build networks of contacts
between individuals. The contact networks obtained from these different techniques are however quantitatively different. Here, we first show how these discrepancies affect the prediction of the
epidemic risk when these data are fed to numerical models of epidemic spread: the low participation rate, the under-reporting of contacts, and the overestimation of contact durations in contact diaries with respect to sensor data indeed lead to important differences in the outcomes of the corresponding simulations, for instance an enhanced sensitivity to initial conditions. Most importantly, we
investigate if and how information gathered from contact diaries can be used in such simulations in order to yield an accurate description of the epidemic risk, assuming that data from sensors represent the ground truth. The contact networks built from contact sensors and diaries indeed present several structural similarities: this suggests the possibility of constructing, using only the contact diary network information, a surrogate contact network such that simulations using this surrogate network give the same estimation of the epidemic risk as simulations using the contact sensor network. We present and compare several methods to build such surrogate data, and show
that it is indeed possible to obtain a good agreement between the outcomes of simulations using surrogate and sensor data, as long as the contact diary information is complemented by publicly
available data describing the heterogeneity of the durations of human contacts.Rossana Mastrandrearossana.mastrandrea@imtlucca.itAlain Barrat2016-06-28T09:56:20Z2016-06-28T09:56:20Zhttp://eprints.imtlucca.it/id/eprint/3506This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35062016-06-28T09:56:20ZTuple Spaces Implementations and Their EfficiencyAmong the paradigms for parallel and distributed computing, the one popularized by Linda and based on tuple spaces is the least used, despite being intuitive and easy to understand and use. A tuple space is a repository of tuples, where processes can add, withdraw or read tuples by means of atomic operations. Tuples may contain different values, and processes can inspect the content of a tuple via pattern matching. The lack of a reference implementation for this paradigm has prevented its widespread adoption. In this paper, we first present an extensive analysis of the state-of-the-art implementations and summarise their characteristics. We then select three implementations of the tuple space paradigm and compare their performance on three case studies that stress different aspects of computing, such as communication, data manipulation, and CPU usage. After reasoning about the strengths and weaknesses of the three implementations, we conclude with some recommendations for future work towards building an effective implementation of the tuple space paradigm.Vitaly BuravlevRocco De Nicolar.denicola@imtlucca.itClaudio Antares Mezzinaclaudio.mezzina@imtlucca.it2016-06-28T09:51:00Z2016-06-28T09:51:00Zhttp://eprints.imtlucca.it/id/eprint/3505This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35052016-06-28T09:51:00ZReversible Sessions Using MonitorsClaudio Antares Mezzinaclaudio.mezzina@imtlucca.itJorge A. 
Pérez2016-06-20T13:01:40Z2016-06-20T13:01:40Zhttp://eprints.imtlucca.it/id/eprint/3503This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35032016-06-20T13:01:40ZThe Importance of Understanding Contemporary Russia (preface)Riccardo M. Cucciollariccardo.cucciolla@imtlucca.it2016-06-20T12:58:56Z2016-06-20T12:58:56Zhttp://eprints.imtlucca.it/id/eprint/3502This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35022016-06-20T12:58:56Z(Edited by) The power state is back? The evolution of Russian political thought after 1991Riccardo M. Cucciollariccardo.cucciolla@imtlucca.it2016-06-15T07:29:33Z2016-06-15T07:29:33Zhttp://eprints.imtlucca.it/id/eprint/3501This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35012016-06-15T07:29:33ZDistress propagation in complex networks: the case of non-linear DebtRankWe consider a dynamical model of distress propagation on complex networks, which we apply to the study of financial contagion in networks of banks connected to each other by direct exposures. The model that we consider is an extension of the DebtRank algorithm, recently introduced in the literature. The mechanics of distress propagation is very simple: when a bank suffers a loss, distress propagates to its creditors, who in turn suffer losses, and so on. The original DebtRank assumes that losses are propagated linearly between connected banks. Here we relax this assumption and introduce a one-parameter family of non-linear propagation functions. As a case study, we apply this algorithm to a dataset of 183 European banks, and we study how the stability of the system depends on the non-linearity parameter under different stress-test scenarios. 
We find that the system is characterized by a transition between a regime where small shocks can be amplified and a regime where shocks do not propagate, and that the overall stability of the system increases between 2008 and 2013.Marco BardosciaFabio CaccioliJuan Ignacio Perottijuanignacio.perotti@imtlucca.itGianna Vivaldogianna.vivaldo@imtlucca.itGuido Caldarelliguido.caldarelli@imtlucca.it2016-06-15T07:24:15Z2016-09-08T15:24:10Zhttp://eprints.imtlucca.it/id/eprint/3500This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/35002016-06-15T07:24:15ZEconomic cycles and their synchronization: a
comparison of cyclic modes in three European countriesThe present work applies singular spectrum analysis (SSA) to the study of macroeconomic fluctuations in three European countries: Italy, The Netherlands, and the United Kingdom. This advanced spectral method provides valuable spatial and frequency information for multivariate data sets and goes far beyond the classical forms of time domain analysis. In particular, SSA enables us to identify dominant cycles that characterize the deterministic behavior of each time series separately, as well as their shared behavior. We demonstrate its usefulness by analyzing several fundamental indicators of the three countries' real aggregate economy in a univariate as well as a multivariate setting. Since business cycles are international phenomena that show common characteristics across countries, our aim is to uncover supranational behavior within the set of representative European economies selected herein. Finally, the analysis is extended to include several indicators from the U.S. economy, in order to examine its influence on the European economies under study and their interrelationships.Lisa SellaGianna Vivaldogianna.vivaldo@imtlucca.itAndreas GrothMichael Ghil2016-06-15T07:23:45Z2016-06-15T07:23:45Zhttp://eprints.imtlucca.it/id/eprint/3499This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34992016-06-15T07:23:45ZNetworks of plants: how to measure similarity in vegetable speciesDespite the common misconception that they are nearly static organisms, plants do interact continuously with the environment and with each other. It is fair to assume that during their evolution they developed particular features to overcome similar problems and to exploit possibilities offered by the environment. In this paper we introduce various quantitative measures, based on recent advancements in complex network theory, that make it possible to measure the effective similarities of various species. By applying this approach to similarity in fruit-typology ecological traits, we obtain a clear plant classification, in a way similar to traditional taxonomic classification. This result is not trivial, since a similar analysis based on diaspore morphological properties does not provide any clear parameter for classifying plant species. Complex network theory can then be used to determine which features among many can be used to distinguish the scope and possibly the evolution of plants. 
Future uses of this approach range from functional classification to quantitative determination of plant communities in nature.Gianna Vivaldogianna.vivaldo@imtlucca.itElisa MasiCamilla PandolfiStefano MancusoGuido Caldarelliguido.caldarelli@imtlucca.it2016-05-31T12:09:02Z2016-05-31T12:09:02Zhttp://eprints.imtlucca.it/id/eprint/3494This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34942016-05-31T12:09:02ZIntroduzione alla nozione di convergenza stabile e sue variantiIrene Crimaldiirene.crimaldi@imtlucca.it2016-05-23T08:03:01Z2016-05-23T08:03:01Zhttp://eprints.imtlucca.it/id/eprint/3487This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34872016-05-23T08:03:01ZOn the behaviour of deviant communities in online social networksOn-line social networks are complex ensembles of inter-linked communities that interact on different topics. Some communities are characterized by what are usually referred to as deviant behaviours, conducts that are commonly considered inappropriate with respect to the society’s norms or moral standards. Eating disorders, drug use, and adult content consumption are just a few examples. We refer to such communities as deviant networks. It is commonly believed that such deviant networks are niche, isolated social groups, whose activity is well separated from the mainstream social media life. According to this assumption, research studies have mostly considered them in isolation. In this work we focused on adult content consumption networks, which are present in many on-line social media and in the Web in general. We found that a few small and densely connected communities are responsible for most of the content production. In contrast to previous work, we studied how such communities interact with the whole social network. 
We found that the produced content flows to the rest of the network mostly directly or through bridge-communities, reaching at least 450 times more users. We also show that a large fraction of the users can be inadvertently exposed to such content through indirect content resharing. We also discuss a demographic analysis of the producer and consumer networks. Finally, we show that it is easily possible to identify a few core users to radically uproot the diffusion process. We aim to set the basis for studying deviant communities in context.Mauro Colettomauro.coletto@imtlucca.itLuca Maria AielloClaudio LuccheseFabrizio Silvestri2016-05-20T11:37:46Z2016-06-30T12:58:47Zhttp://eprints.imtlucca.it/id/eprint/3486This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34862016-05-20T11:37:46ZA foraminiferal δ18O record covering the last 2,200 yearsThanks to the precise core dating and the high sedimentation rate of the drilling site (Gallipoli Terrace, Ionian Sea) we were able to measure a foraminiferal δ18O series covering the last 2,200 years with a time resolution shorter than 4 years. In order to support the quality of this dataset we link the δ18O values measured in the foraminifera shells to temperature and salinity measurements available for the last thirty years covered by the core. Moreover, we describe in detail the dating procedures based on the presence of volcanic markers along the core and on the measurement of 210Pb and 137Cs activity in the most recent sediment layers. The high time resolution allows for detecting a δ18O decennial-scale oscillation, together with centennial and multicentennial components. Due to the dependence of foraminiferal δ18O on environmental conditions, these oscillations can provide information about temperature and salinity variations in past millennia. 
The strategic location of the drilling area makes this record a unique tool for climate and oceanographic studies of the Central Mediterranean.Carla TariccoSilvia Maria AlessioSara RubinettiGianna Vivaldogianna.vivaldo@imtlucca.itSalvatore Mancuso2016-05-11T11:02:48Z2016-09-13T09:42:41Zhttp://eprints.imtlucca.it/id/eprint/3485This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34852016-05-11T11:02:48ZHuman and Robot Hands: Sensorimotor Synergies to Bridge the Gap Between Neuroscience and RoboticsThe control of the many degrees of freedom of the hand through functional modules (hand synergies) has been proposed as a potentially useful model to describe how the hand can maintain postures while being able to rapidly change its configuration to accomplish a wide range of tasks. However, whether and to what extent synergies are actually encoded in motor cortical areas is still debated. A direct encoding of hand synergies is suggested by electrophysiological studies in nonhuman primates, but the evidence in humans has so far been partial and indirect. In this chapter, we review the organization of the brain network that controls hand posture in humans and present preliminary results of a functional Magnetic Resonance Imaging (fMRI) study on the encoding of synergies at a cortical level to control hand posture in humans.Andrea LeoGiacomo HandjarasHamal MarinoMatteo BianchiPietro Pietrinipietro.pietrini@imtlucca.itEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2016-05-11T10:39:38Z2016-05-11T10:39:38Zhttp://eprints.imtlucca.it/id/eprint/3484This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34842016-05-11T10:39:38ZThe Water Suitcase of Migrants: Assessing Virtual Water Fluxes Associated to Human MigrationDisentangling the relations between human migrations and water resources is relevant for food security and trade policy in water-scarce countries. 
It is commonly believed that human migrations are beneficial to the water endowments of origin countries because they reduce the pressure on local resources. We show here that such a belief is over-simplistic. We reframe the problem by considering the international food trade and the corresponding virtual water fluxes, which quantify the water used for the production of traded agricultural commodities. By means of robust analytical tools, we show that migrants strengthen the commercial links between countries, triggering trade fluxes caused by food consumption habits persisting after migration. Thus migrants significantly increase the virtual water fluxes and the use of water in the countries of origin. The flux ascribable to each migrant, i.e. the "water suitcase", is found to have increased from 321 m3/y in 1990 to 1367 m3/y in 2010. A comparison with the water footprint of individuals shows that where the water suitcase exceeds the water footprint of inhabitants, migrations turn out to be detrimental to the water endowments of origin countries, challenging the common perception that migrations tend to relieve the pressure on the local (water) resources of origin countries.Rodolfo MetuliniStefania TameaFrancesco LaioMassimo Riccabonimassimo.riccaboni@imtlucca.it2016-05-10T10:12:52Z2017-08-04T10:18:13Zhttp://eprints.imtlucca.it/id/eprint/3483This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34832016-05-10T10:12:52ZHow concepts are encoded in the human brain: A modality independent, category-based cortical organization of semantic knowledgeHow conceptual knowledge is represented in the human brain remains to be determined. 
To address the differential role of low-level sensory-based and high-level abstract features in semantic processing, we combined behavioral studies of linguistic production and brain activity measures by functional magnetic resonance imaging in sighted and congenitally blind individuals while they performed a property-generation task with concrete nouns from eight categories, presented through visual and/or auditory modalities. Patterns of neural activity within a large semantic cortical network that comprised parahippocampal, lateral occipital, temporo-parieto-occipital and inferior parietal cortices correlated with linguistic production and were independent of both the modality of stimulus presentation (either visual or auditory) and the (lack of) visual experience. In contrast, selected modality-dependent differences were observed only when the analysis was limited to the individual regions within the semantic cortical network. We conclude that conceptual knowledge in the human brain relies on a distributed, modality-independent cortical representation that integrates the partial category- and modality-specific information retained at a regional level.Giacomo HandjarasEmiliano Ricciardiemiliano.ricciardi@imtlucca.itAndrea LeoAlessandro LenciLuca Cecchettiluca.cecchetti@imtlucca.itMirco CosottiniGiovanna MarottaPietro Pietrinipietro.pietrini@imtlucca.it2016-05-10T09:59:13Z2016-05-10T09:59:13Zhttp://eprints.imtlucca.it/id/eprint/3482This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34822016-05-10T09:59:13ZLeaf segmentation in plant phenotyping: a collation studyImage-based plant phenotyping is a growing application area of computer vision in agriculture. A key task is the segmentation of all individual leaves in images. Here we focus on the most common rosette model plants, Arabidopsis and young tobacco. 
Although leaves do share appearance and shape characteristics, the presence of occlusions and variability in leaf shape and pose, as well as imaging conditions, render this problem challenging. The aim of this paper is to compare several leaf segmentation solutions on a unique and first-of-its-kind dataset containing images from typical phenotyping experiments. In particular, we report and discuss methods and findings of a collection of submissions for the first Leaf Segmentation Challenge of the Computer Vision Problems in Plant Phenotyping workshop in 2014. Four methods are presented: three segment leaves by processing the distance transform in an unsupervised fashion, and the fourth via optimal template selection and Chamfer matching. Overall, we find that although separating plant from background can be accomplished with satisfactory accuracy (>90% Dice score), individual leaf segmentation and counting remain challenging when leaves overlap. Additionally, accuracy is lower for younger leaves. We also find that variability in datasets does affect outcomes. Our findings motivate further investigation and the development of specialized algorithms for this particular application, and suggest that challenges of this form are ideally suited for advancing the state of the art. Data are publicly available (online at http://www.plant-phenotyping.org/datasets ) to support future challenges beyond segmentation within this application domain.Hanno ScharrMassimo Minervinimassimo.minervini@imtlucca.itAndrew P. FrenchChristian KlukasDavid M. KramerXiaoming LiuImanol LuengoJean-Michel PapeGerrit PolderDanijela VukadinovicXi YinSotirios A. 
Tsaftarissotirios.tsaftaris@imtlucca.it2016-04-28T07:52:35Z2016-04-28T07:52:35Zhttp://eprints.imtlucca.it/id/eprint/3475This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34752016-04-28T07:52:35ZA Network Model characterized by a Latent Attribute Structure with CompetitionThe quest for a model that is able to explain, describe, analyze and simulate real-world complex networks is of utmost practical, as well as theoretical, interest. In fact, networks can be a natural way to represent many phenomena; often, they arise from a complex interweaving of some features of the nodes. For example, in a co-authorship network, a link arises more easily between authors with similar interests; similarly, in a genetic regulatory network, links are affected by the different biological functions of the regulators.
In this paper we introduce and study a novel network model that is based on a latent attribute structure: this model, inspired by a generalization of the Indian Buffet process, is simple and contains a small number of parameters, with a clear and intuitive role. Each node is characterized by a number of features and the probability of the existence of an edge between two nodes depends on the features they share; the number of possible features is not fixed a priori and can grow indefinitely. Moreover, a random fitness parameter is introduced for each node in order to determine its ability to transmit its own features to other nodes; this behavior is added on top of a process of Indian-Buffet type. Because of the fitness property, a node’s connectivity does not depend on its age alone, so that “young but fit” nodes are also able to compete and succeed in propagating their features and acquiring links. We also show how, considering the resulting bipartite node-attribute network, it is possible to gain some insight into which nodes were originally the most “fit”.
Our model for this bipartite network depends on a few parameters, which are characterized by their straightforward interpretation and by the availability of proper estimators. Although the parameters are easy to interpret and tune, the model is general enough to represent complex phenomena, e.g., homophily, heterophily, or any interplay between features. We provide some theoretical as well as experimental results regarding the power-law behavior of the model and the proposed tools for the estimation of the parameters. We also show, through a number of experiments, how the proposed model naturally captures most local and global properties (e.g., degree distributions, connectivity and distance distributions) that real networks exhibit.Paolo Boldiboldi@di.unimi.itIrene Crimaldiirene.crimaldi@imtlucca.itCorrado Monticorrado.monti@unimi.it2016-04-27T07:56:32Z2016-04-27T07:56:32Zhttp://eprints.imtlucca.it/id/eprint/3474This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34742016-04-27T07:56:32ZA probabilistic interpretation of set-membership filtering: application to polynomial systems through polytopic boundingSet-membership estimation is usually formulated in the context of set-valued calculus and no probabilistic calculations are necessary. In this paper, we show that set-membership estimation can be equivalently formulated in the probabilistic setting by employing sets of probability measures. Inference in set-membership estimation is thus carried out by computing expectations with respect to the updated set of probability measures P as in the probabilistic case. In particular, it is shown that inference can be performed by solving a particular semi-infinite linear programming problem, which is a special case of the truncated moment problem in which only the zeroth order moment is known (i.e., the support). By writing the dual of the above semi-infinite linear programming problem, it is shown that, if the nonlinearities in the measurement and process equations are polynomial and if the bounding sets for initial state, process and measurement noises are described by polynomial inequalities, then an approximation of this semi-infinite linear programming problem can efficiently be obtained by using the theory of sum-of-squares polynomial optimization. 
We then derive a smart greedy procedure to compute a polytopic outer-approximation of the true membership-set, by computing the minimum-volume polytope that outer-bounds the set that includes all the means computed with respect to P.Alessio BenavoliDario Pigadario.piga@imtlucca.it2016-04-27T07:52:54Z2016-04-27T07:52:54Zhttp://eprints.imtlucca.it/id/eprint/3472This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34722016-04-27T07:52:54ZSparse optimization for automated energy end use disaggregationRetrieving the household electricity consumption at individual appliance level is an essential requirement to assess the contribution of different end uses to the total household consumption, and thus to design energy saving policies and user-tailored feedback for reducing household electricity usage. This has led to the development of nonintrusive appliance load monitoring (NIALM), or energy disaggregation, algorithms, which aim to decompose the aggregate energy consumption data collected from a single measurement point into device-level consumption estimations. Existing NIALM algorithms are able to provide accurate estimates of the fraction of energy consumed by each appliance. Yet, in the authors' experience, they provide poor performance in reconstructing the power consumption trajectories over time. In this brief, a new NIALM algorithm is presented, which, besides providing very accurate estimates of the aggregated consumption by appliance, also accurately characterizes the appliance power consumption profiles over time. The proposed algorithm is based on the assumption that the unknown appliance power consumption profiles are piecewise constant over time (as is typical for power use patterns of household appliances) and it exploits the information on the time-of-day probability in which a specific appliance might be used. 
The disaggregation problem is formulated as a least-squares error minimization problem, with an additional (convex) penalty term that enforces the disaggregated signals to be piecewise constant over time. Testing on household electricity data available in the literature is reported.Dario Pigadario.piga@imtlucca.itAndrea CominolaMatteo GiulianiAndrea CastellettiAndrea Emilio Rizzoli2016-04-19T08:08:19Z2016-04-19T08:08:19Zhttp://eprints.imtlucca.it/id/eprint/3457This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34572016-04-19T08:08:19ZNode-to-segment and node-to-surface interface finite elements for fracture mechanicsThe topologies of existing interface elements used to discretize cohesive cracks are such that they can be used to compute the relative displacements (displacement discontinuities) of two opposing segments (in 2D) or of two opposing facets (in 3D) belonging to the opposite crack faces and enforce the cohesive traction–separation relation. In the present work we propose a novel type of interface element for fracture mechanics sharing some analogies with the node-to-segment (in 2D) and with the node-to-surface (in
3D) contact elements. The displacement gap of a node belonging to the finite element discretization of one crack face with respect to its projected point on the opposite face is used to determine the cohesive tractions, the residual vector and its consistent linearization for an implicit solution scheme. The following advantages with respect to classical interface finite elements are demonstrated: (i) non-matching finite element discretizations of the opposite crack faces are possible; (ii) easy modeling of cohesive cracks with non-propagating crack tips; (iii) the internal rotational equilibrium of the interface element is assured. Detailed examples are provided to show the usefulness of the proposed approach in nonlinear fracture mechanics problems.Marco Paggimarco.paggi@imtlucca.itPeter Wriggers2016-04-19T08:06:52Z2016-04-19T08:06:52Zhttp://eprints.imtlucca.it/id/eprint/3456This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34562016-04-19T08:06:52ZFrom NASGRO to fractals: Representing crack growth in metalsThis paper presents the results of an extensive experimental analysis of the fractal properties of fatigue crack rough surfaces. The analysis of the power-spectral density functions of profilometric traces shows a predominance of the box fractal dimension D = 1.2. This result leads to a particularization of the fatigue crack growth equation based on fractality, proposed by the last two authors, which is very close to the generalized Frost–Dugdale equation proposed by the first three authors. The two approaches, albeit based
on different initial modelling assumptions, are both very effective in predicting the crack growth rate of short cracks.R. JonesF. ChenS. PittMarco Paggimarco.paggi@imtlucca.itAlberto Carpinteri2016-04-19T08:05:08Z2017-09-26T09:15:12Zhttp://eprints.imtlucca.it/id/eprint/3383This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/33832016-04-19T08:05:08ZImages, Invisibility, and Motion: Brief Essay on Chronophotography, Cinema, and Optical UnconsciousThis paper is based on an analysis of the notion of “Optical Unconscious” by Walter Benjamin.
It seeks to present an interpretation of this notion in connection with the historical relationship between the birth of cinematic technology on the one hand and research conducted during the same period in the field of physiology investigating human and animal movement on the other.
To this end I will analyze the following three concepts: (a) alienation, (b) automatism and (c) invisibility.
(a) In Minutiae, Close-up, Microanalysis, Carlo Ginzburg formulates an analogy to describe the “optical unconscious” and juxtaposes it with a page from Marcel Proust in which the alien gaze of the narrator parallels the imperturbable lens of a camera and he experiences the physiognomy of the objects in their anonymous being.
Through reference to this passage, I seek to prove that the meaning of the photographic image does not reside in its ability to reflect its object as something real and familiar, but rather in its ability to alienate this object and make it foreign to the observer.
(b) This paper will link this impersonality of the subject to the concept of automatism and analyze this link through William K.L. Dickson’s Kinetoscopic Record of a Sneeze (1894) as an example of the many images of the time that depicted an involuntary movement on the part of the represented subject, namely an action or series of actions beyond the subject’s control.
I assert that this idea of displaying the ordinariness of an involuntary action constitutes a specificity on which both photographic and cinematographic technology are based.
(c) The comprehensive meaning of the represented subject therefore depends on the device; without it, the subject would have been doomed to invisibility. In order to clarify what kind of invisibility is at stake here, my study will take a step back to examine the historical origins of the photo-cinematographic tools used in experimental physiology and the role that representation came to have (the idea of the autonomy of the representation).
Finally, in order to clarify these hypotheses, I shall analyze the case of the French physiologist Étienne-Jules Marey.
By studying physiological theories on motion at the end of the nineteenth century, this paper will bring the relationship between photography and cinema back to its historical origins and highlight moments of intersection.Linda Bertellilinda.bertelli@imtlucca.it2016-04-19T07:48:41Z2016-04-19T09:06:15Zhttp://eprints.imtlucca.it/id/eprint/3346This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/33462016-04-19T07:48:41ZReversibility in the higher-order \(π\)-calculusThe notion of reversible computation is attracting increasing interest because of its applications in diverse fields, in particular the study of programming abstractions for reliable systems. In this paper, we continue the study undertaken by Danos and Krivine on reversible CCS by defining a reversible higher-order π-calculus, called rhoπ. We prove that reversibility in our calculus is causally consistent and that the causal information used to support reversibility in rhoπ is consistent with the one used in the causal semantics of the π-calculus developed by Boreale and Sangiorgi. Finally, we show that one can faithfully encode rhoπ into a variant of higher-order π, substantially improving on the result we obtained in the conference version of this paper.Ivan LaneseClaudio Antares Mezzinaclaudio.mezzina@imtlucca.itJean-Bernard Stefani2016-04-13T08:40:24Z2016-04-13T08:40:24Zhttp://eprints.imtlucca.it/id/eprint/3437This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34372016-04-13T08:40:24ZEfficient Syntax-Driven Lumping of Differential EquationsWe present an algorithm to compute exact aggregations of a class of systems of ordinary differential equations (ODEs). Our approach consists in an extension of Paige and Tarjan’s seminal solution to the coarsest refinement problem by encoding an ODE system into a suitable discrete-state representation.
In particular, we consider a simple extension of the syntax of elementary chemical reaction networks because (i) it can express ODEs with derivatives given by polynomials of degree at most two, which are relevant in many applications in natural sciences and engineering; and (ii) we can build on two recently introduced bisimulations, which yield two complementary notions of ODE lumping. Our algorithm computes the largest bisimulations in O(r⋅s⋅log s) time, where r is the number of monomials and s is the number of variables in the ODEs. Numerical experiments on real-world models from biochemistry, electrical engineering, and structural mechanics show that our prototype is able to handle ODEs with millions of variables and monomials, providing significant model reductions.Luca CardelliMirco Tribastonemirco.tribastone@imtlucca.itMax Tschaikowskimax.tschaikowski@imtlucca.itAndrea Vandinandrea.vandin@imtlucca.it2016-04-13T08:31:40Z2016-04-13T08:31:40Zhttp://eprints.imtlucca.it/id/eprint/3436This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34362016-04-13T08:31:40ZNoise Reduction in Complex Biological SwitchesCells operate in noisy molecular environments via complex regulatory networks. It is possible to understand how molecular counts are related to noise in specific networks, but it is not generally clear how noise relates to network complexity, because different levels of complexity also imply different overall numbers of molecules. For a fixed function, does increased network complexity reduce noise, beyond the mere increase of overall molecular counts? If so, complexity could provide an advantage counteracting the costs involved in maintaining larger networks. For that purpose, we investigate how noise affects multistable systems, where a small amount of noise could lead to very different outcomes; thus we turn to biochemical switches.
Our method for comparing networks of different structure and complexity is to place them in conditions where they produce exactly the same deterministic function. We are then in a good position to compare their noise characteristics relative to their identical deterministic traces. We show that more complex networks are better at coping with both intrinsic and extrinsic noise. Intrinsic noise tends to decrease with complexity, and extrinsic noise tends to have less impact. Our findings suggest a new role for increased complexity in biological networks, at parity of function.Luca CardelliAttila Csikász-NagyNeil DalchauMirco Tribastonemirco.tribastone@imtlucca.itMax Tschaikowskimax.tschaikowski@imtlucca.it2016-04-13T08:26:32Z2016-04-13T08:26:32Zhttp://eprints.imtlucca.it/id/eprint/3435This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34352016-04-13T08:26:32ZSymbolic Computation of Differential EquivalencesOrdinary differential equations (ODEs) are widespread in many natural sciences including chemistry, ecology, and systems biology, and in disciplines such as control theory and electrical engineering. Building on the celebrated molecules-as-processes paradigm, they have become increasingly popular in computer science, with high-level languages and formal methods such as Petri nets, process algebra, and rule-based systems that are interpreted as ODEs. We consider the problem of comparing and minimizing ODEs automatically. Influenced by traditional approaches in the theory of programming, we propose differential equivalence relations. We study them for a basic intermediate language, for which we have decidability results, that can be targeted by a class of high-level specifications. An ODE implicitly represents an uncountable state space, hence reasoning techniques cannot be borrowed from established domains such as probabilistic programs with finite-state Markov chain semantics.
We provide novel symbolic procedures to check an equivalence and compute the largest one via partition refinement algorithms that use satisfiability modulo theories. We illustrate the generality of our framework by showing that differential equivalences include (i) well-known notions for the minimization of continuous-time Markov chains (lumpability), (ii) bisimulations for chemical reaction networks recently proposed by Cardelli et al., and (iii) behavioral relations for process algebra with ODE semantics. With a prototype implementation we are able to detect equivalences in biochemical models from the literature that cannot be reduced using competing automatic techniques.Luca CardelliMirco Tribastonemirco.tribastone@imtlucca.itMax Tschaikowskimax.tschaikowski@imtlucca.itAndrea Vandinandrea.vandin@imtlucca.it2016-04-13T08:14:47Z2016-04-13T08:14:47Zhttp://eprints.imtlucca.it/id/eprint/3433This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/34332016-04-13T08:14:47ZComparing Chemical Reaction Networks: A Categorical and Algorithmic PerspectiveWe study chemical reaction networks (CRNs) as a kernel language for concurrency models with semantics based on ordinary differential equations. We investigate the problem of comparing two CRNs, i.e., to decide whether the trajectories of a source CRN can be matched by a target CRN under an appropriate choice of initial conditions. Using a categorical framework, we extend and relate model-comparison approaches based on structural (syntactic) and on dynamical (semantic) properties of a CRN, proving their equivalence. Then, we provide an algorithm to compare CRNs, running linearly in time with respect to the cardinality of all possible comparisons.
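The exact ODE aggregations that the lumping and equivalence abstracts above compute can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not material from the papers: for a linear system dx/dt = A x, a partition of the variables is an exact (forward) lumping when an aggregated matrix Â with M A = Â M exists, where M sums the variables of each block.

```python
import numpy as np

# Hypothetical 4-variable linear ODE dx/dt = A @ x in which variables
# {0, 1} and {2, 3} play symmetric roles, so they can be aggregated.
A = np.array([[-2.0,  0.0,  1.0,  1.0],
              [ 0.0, -2.0,  1.0,  1.0],
              [ 1.0,  1.0, -3.0,  0.0],
              [ 1.0,  1.0,  0.0, -3.0]])

def is_exact_lumping(A, blocks):
    """Check whether a partition of the variables is an exact (forward)
    lumping of dx/dt = A x, i.e. whether an aggregated matrix A_hat with
    M A = A_hat M exists, where M sums the variables of each block."""
    n = A.shape[0]
    M = np.zeros((len(blocks), n))
    for i, block in enumerate(blocks):
        M[i, block] = 1.0
    # M has full row rank, so A_hat = M A M^+ is the only candidate.
    A_hat = M @ A @ np.linalg.pinv(M)
    return bool(np.allclose(M @ A, A_hat @ M))

print(is_exact_lumping(A, [[0, 1], [2, 3]]))  # True: the blocks aggregate exactly
print(is_exact_lumping(A, [[0, 2], [1, 3]]))  # False: this partition does not
```

The algorithms in the abstracts go further: they *search* for the coarsest such partition by partition refinement; this sketch only verifies a given candidate.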
Finally, we apply our results to biological models from the literature.Luca CardelliMirco Tribastonemirco.tribastone@imtlucca.itMax Tschaikowskimax.tschaikowski@imtlucca.itAndrea Vandinandrea.vandin@imtlucca.it2016-04-04T09:07:50Z2016-04-04T09:07:50Zhttp://eprints.imtlucca.it/id/eprint/3350This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/33502016-04-04T09:07:50ZA thermo-visco-elastic shear-lag model for the prediction of residual stresses in photovoltaic modules after laminationThe distribution of residual thermo-elastic stresses in encapsulated solar cells arising from lamination is relevant for the characterization of the long-term performance of photovoltaic (PV) modules during service. Accurate modelling of the structural response of the laminate in the transient regime during cooling after lamination is a challenging task from the computational point of view. In this work we propose a semi-analytic model based on the Kirchhoff plate theory and the shear-lag approach for the treatment of the polymeric encapsulant layers and accounting for their time and temperature dependency according to a rheological model derived from fractional calculus considerations. Spatially uniform and non-uniform temperature distributions are compared to accurately assess the amount of the residual compressive stresses induced in the silicon cells after lamination.
The use of more realistic non-uniform temperature distributions leads to lower residual compressive stresses in silicon compared to the uniform case.Saheed Olalekan Ojosaheed.ojo@imtlucca.itMarco Paggimarco.paggi@imtlucca.it2016-03-21T09:02:41Z2016-03-21T09:19:23Zhttp://eprints.imtlucca.it/id/eprint/3243This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32432016-03-21T09:02:41ZA poly(ether-ester) copolymer for the preparation of nanocarriers with improved degradation and drug delivery kineticsThis paper reports the synthesis and the physicochemical, functional and biological characterisations of nanocarriers made of a novel di-block biodegradable poly(ether-ester) copolymer. This material presents tunable, fast biodegradation rates, but its products are less acidic than those of other bioresorbable polymers like PLGA, thus presenting a better biocompatibility profile and the possibility to carry pH-sensitive payloads. A method for the production of monodisperse and spherical nanoparticles is proposed; drug delivery kinetics and blood protein adsorption were measured to evaluate the functional properties of these nanoparticles as drug carriers. The copolymer was labelled with a fluorescent dye for internalisation tests, and rhodamine B was used as a model cargo to study transport and release inside cultured cells. Biological tests demonstrated good cytocompatibility, significant cell internalisation and the possibility to deliver non-cell-penetrating moieties into endothelial cells.
Taken together, these results support the potential use of this nanoparticulate system for systemic administration of drugs.Mariacristina Gagliardimariacristina.gagliardi@imtlucca.itAlice BerteroGiuseppe BardiAngelo Bifone2016-03-21T08:42:57Z2017-09-26T09:15:52Zhttp://eprints.imtlucca.it/id/eprint/3242This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32422016-03-21T08:42:57ZTowards an Economy of the BodyThis paper focuses on three main cases: Étienne-Jules Marey (1830-1904), Georges Demeny (1850-1917) and the Gilbreths (Frank B. Gilbreth, 1868-1924 and Lillian M. Gilbreth, 1878-1972), and their respective studies of movement.
More specifically, it investigates Marey’s and Demeny’s experiments with fixed-plate chronophotography (1883-1886) focused on human locomotion, Demeny’s research with chronophotography on sensitive strip and celluloid film studying gymnastics (1888-1892), and Frank and Lillian Gilbreth’s work on the cyclograph (as expounded in Fatigue Study and Applied Motion Study).
Firstly, the paper analyzes the three different procedural protocols of these experiments in order to identify their similarities and differences and to understand what, if any, experimental model they give rise to. It scrutinizes in particular: a) the position of the scientist’s body in the experimental field and its role in theory (the training of the observer’s/scientist’s body); b) the preparation of the body of the subjects to be analyzed (before the experiments) and the way these bodies were posed in the experimental field (during the experiments); c) the status of the camera’s mechanical body.
Secondly and finally, the paper aims to show how all these regulatory norms serve and enable a certain economy of the body in two interconnected senses: a) economy as a form of reduction. The paper analyzes different ways the body inside the experimental field is isolated/deleted depending on whether it is the scientist’s body, the subject’s body under analysis or the body of the camera; b) economy as a system of efficiency. Beginning from M. Mauss’s notion of “techniques du corps” as a general theoretical framework as well as specific examples of disciplining effects on individuals (See Phéline, Christian. L'image accusatrice. Laplume, France: Association de critique contemporaine en photographie, 1985), this paper seeks to outline the historical role played by the abovementioned studies of the body in developing efficiency (and its relationship with work) as an object of knowledge.Linda Bertellilinda.bertelli@imtlucca.it2016-03-21T08:41:12Z2016-03-21T08:41:12Zhttp://eprints.imtlucca.it/id/eprint/3240This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32402016-03-21T08:41:12ZParcellation-based connectome assessment by using structural and functional connectivityConnectome analysis of the human brain structural and functional architecture provides a unique opportunity to understand the organization of brain networks. In this work, we investigate a novel large scale parcellation-based connectome, merging together information coming from resting state fMRI (rs-fMRI) data and diffusion tensor imaging (DTI) measurements.Ying-Chia Linyingchia.lin@imtlucca.itTommaso GiliSotirios A. 
TsaftarisAndrea GabrielliMariangela IorioGianfranco SpallettaGuido Caldarelliguido.caldarelli@imtlucca.it2016-03-21T08:41:04Z2016-03-21T08:41:04Zhttp://eprints.imtlucca.it/id/eprint/3239This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32392016-03-21T08:41:04ZA cortical and sub-cortical parcellation clustering by intrinsic functional connectivityNetwork analysis of resting-state fMRI (rsfMRI) has been widely utilized to investigate the functional architecture of the whole brain. Here we propose a robust parcellation method that first divides cortical and sub-cortical regions into sub-regions by clustering the rsfMRI data for each subject independently, and then merges those individual parcellations to obtain a global whole brain parcellation. To do so our method relies on majority voting (to merge parcellations of multiple subjects) and enforces spatial constraints within a hierarchical agglomerative clustering framework to define parcels that are spatially homogeneous.Ying-Chia Linyingchia.lin@imtlucca.itTommaso GiliSotirios A. TsaftarisAndrea GabrielliMariangela IorioGianfranco SpallettaGuido Caldarelliguido.caldarelli@imtlucca.it2016-03-17T10:50:37Z2016-05-04T10:13:13Zhttp://eprints.imtlucca.it/id/eprint/3241This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32412016-03-17T10:50:37ZMilitary Conflict and the Rise of Urban EuropeWe present new evidence about the relationship between military conflict and city
population growth in Europe from the fall of Charlemagne’s empire to the start of
the Industrial Revolution. Military conflict was a main feature of European history.
We argue that cities were safe harbors from conflict threats. To test this argument,
we construct a novel database that geocodes the locations of more than 800 conflicts
between 800 and 1799. We find a significant, positive, and robust relationship that
runs from conflict exposure to city population growth. Our analysis suggests that
military conflict played a key role in the rise of urban Europe.Mark DinceccoMassimiliano Gaetano Onoratomassimiliano.onorato@imtlucca.it2016-03-11T12:10:25Z2016-05-04T09:55:02Zhttp://eprints.imtlucca.it/id/eprint/3214This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32142016-03-11T12:10:25ZConservation integrals for two circular holes kept at different temperatures in a thermoelastic solidAbstract An explicit analytic solution for thermal stresses in an infinite thermoelastic medium with two circular cylindrical holes of different sizes kept at different constant temperatures, under steady-state heat flux is presented. The solution is obtained by using the most general representation of a biharmonic function in bipolar coordinates. The stress field is decomposed into the sum of a particular stress field induced by the steady-state temperature distribution and an auxiliary isothermal stress field required to satisfy the boundary conditions on the holes. The variations of the stress concentration factor on the surface of the holes are determined for varying geometry of the holes. The concept of the conservation integrals Jk, M and L is extended to steady state thermoelasticity and the integrals are proved to be path-independent. These integrals are calculated on closed contours encircling one or both holes. The geometries of a hole in a half-space and an eccentric annular cylinder are considered as particular cases.Enrico RadiLorenzo Morinilorenzo.morini@imtlucca.itI. Sevostianov2016-03-11T12:02:01Z2016-05-04T09:53:37Zhttp://eprints.imtlucca.it/id/eprint/3212This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/32122016-03-11T12:02:01ZMultiscale asymptotic homogenization analysis of thermo-diffusive composite materialsIn this paper an asymptotic homogenization method for the analysis of composite materials with periodic microstructure in presence of thermodiffusion is described. 
Appropriate down-scaling relations correlating the microscopic fields to the macroscopic displacements, temperature and chemical potential are introduced. The effects of the material inhomogeneities are described by perturbation functions derived from the solution of recursive cell problems. Exact expressions for the overall elastic and thermodiffusive constants of the equivalent first order thermodiffusive continuum are derived. The proposed approach is applied to the case of a two-dimensional bi-phase orthotropic layered material, where the effective elastic and thermodiffusive properties can be determined analytically. Considering this illustrative example and assuming periodic body forces, heat and mass sources acting on the medium, the solution provided by the first order homogenization approach is compared with the numerical results obtained by the heterogeneous model.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itLorenzo Morinilorenzo.morini@imtlucca.itAndrea Piccolroaz2016-03-08T11:44:02Z2016-09-14T10:21:16Zhttp://eprints.imtlucca.it/id/eprint/3195This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31952016-03-08T11:44:02ZThe global administrative law scholarshipLorenzo Casinilorenzo.casini@imtlucca.it2016-03-08T11:29:28Z2016-09-14T10:21:16Zhttp://eprints.imtlucca.it/id/eprint/3194This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31942016-03-08T11:29:28ZThe expansion of the material scope of global lawLorenzo Casinilorenzo.casini@imtlucca.it2016-03-01T10:32:04Z2016-09-12T08:28:51Zhttp://eprints.imtlucca.it/id/eprint/3170This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31702016-03-01T10:32:04ZA synergy-based hand control is encoded in human motor cortical areasHow the human brain controls hand movements to carry out different tasks is still debated.
The concept of synergy has been proposed to indicate functional modules that may simplify the control of hand postures by simultaneously recruiting sets of muscles and joints. However, whether and to what extent synergic hand postures are encoded as such at a cortical level remains unknown. Here, we combined kinematic, electromyographic, and brain activity measures obtained by functional magnetic resonance imaging while subjects performed a variety of movements towards virtual objects. Hand postural information, encoded through kinematic synergies, was represented in cortical areas devoted to hand motor control and successfully discriminated individual grasping movements, significantly outperforming alternative somatotopic or muscle-based models. Importantly, hand postural synergies were predicted by neural activation patterns within primary motor cortex. These findings support a novel cortical organization for hand movement control and open potential applications for brain-computer interfaces and neuroprostheses.Andrea LeoGiacomo HandjarasMatteo BianchiHamal MarinoMarco GabicciniAndrea GuidiEnzo Pasquale ScilingoPietro Pietrinipietro.pietrini@imtlucca.itAntonio BicchiMarco SantelloEmiliano Ricciardiemiliano.ricciardi@imtlucca.it2016-02-29T10:02:43Z2016-02-29T10:02:43Zhttp://eprints.imtlucca.it/id/eprint/3159This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31592016-02-29T10:02:43ZAn extreme value analysis of the last century crises
across industries in the U.S. economyThe two large-scale crises that hit the world economy in the last century, i.e., the Great
Depression and the Great Recession, have similar outbreak and recovery patterns with
respect to several macroeconomic variables. In particular, the largest depressions are
likely to be accompanied by stock-market crashes. This study investigates the behavior
of the U.S. stock market before, during and after deep downturns, focusing particularly
on the tails of the return distribution. We develop two automatic procedures to identify
multiple change-points in the tail of financial time series as well as in the co-crash and
co-boom probabilities of different markets. We then apply our methodology to twelve
time series representative of the sectors of the U.S. economy. We find that regime
shifts in the lower tail of the distribution tend to co-occur before deep downturns. Our
results contribute to a better understanding of the origin and systemic nature of large-scale
events to make policy interventions more timely and effective.Marco BeeMassimo Riccabonimassimo.riccaboni@imtlucca.itLuca Trapinluca.trapin@imtlucca.it2016-02-29T09:01:28Z2016-03-01T10:41:59Zhttp://eprints.imtlucca.it/id/eprint/3147This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31472016-02-29T09:01:28ZOn the approximation of the optimal control functions in
stochastic optimal control problemsGiorgio Gneccogiorgio.gnecco@imtlucca.itMarcello Sanguineti2016-02-29T09:01:04Z2016-03-04T08:25:15Zhttp://eprints.imtlucca.it/id/eprint/3148This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31482016-02-29T09:01:04ZSymmetry and antisymmetry properties of optimal solutions to regression problemsGiorgio Gneccogiorgio.gnecco@imtlucca.it2016-02-29T08:59:31Z2016-02-29T08:59:31Zhttp://eprints.imtlucca.it/id/eprint/3151This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31512016-02-29T08:59:31ZDirect learning of LPV controllers from dataIn many control applications, it is attractive to describe nonlinear (NL) and time-varying (TV) plants by linear parameter-varying (LPV) models and design controllers based on such representations to regulate the behaviour of the system. The LPV system class offers the representation of NL and TV phenomena as a linear dynamic relationship between input and output signals, a relationship that depends on measurable signals, e.g., operating conditions, often called scheduling variables. For such models, powerful control synthesis tools are available, but how to systematically convert available first-principles models to LPV descriptions of the plant, how to efficiently identify LPV models for control from data, and how modeling errors affect the control performance are still subjects of ongoing research. Therefore, it is attractive to synthesize the controller directly from data without the need of modeling the plant and addressing the underlying difficulties. Hence, in this paper, a novel data-driven synthesis scheme is proposed in a stochastic framework to provide a practically applicable solution for synthesizing LPV controllers directly from data. Both the cases of fixed-order controller tuning and controller structure learning are discussed, and two different design approaches are provided.
The effectiveness of the proposed methods is also illustrated by means of an academic example and a simulation case study based on a real application.Simone FormentinDario Pigadario.piga@imtlucca.itRoland TóthSergio M. Savaresi2016-02-29T08:58:00Z2016-02-29T08:58:00Zhttp://eprints.imtlucca.it/id/eprint/3150This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31502016-02-29T08:58:00ZComputation of the Structured Singular Value via Moment LMI RelaxationsThe Structured Singular Value (SSV) provides a powerful tool to test robust stability and performance of feedback systems subject to structured uncertainties. Unfortunately, computing the SSV is an NP-hard problem, and the polynomial-time algorithms available in the literature are only able to provide, except for some special cases, upper and lower bounds on the exact value of the SSV. In this work, we present a new algorithm to compute an upper bound on the SSV in the case of mixed real/complex uncertainties. The underlying idea of the developed approach is to formulate the SSV computation as a (nonconvex) polynomial optimization problem, which is relaxed into a sequence of convex optimization problems through moment-based relaxation techniques. Two heuristics to compute a lower bound on the SSV are also discussed.
The analyzed numerical examples show that the developed approach provides tighter bounds than the ones computed by the algorithms implemented in the Robust Control Toolbox in Matlab, and it provides, in most of the cases, coincident lower and upper bounds on the structured singular value.Dario Pigadario.piga@imtlucca.it2016-02-26T15:48:35Z2016-03-04T08:26:07Zhttp://eprints.imtlucca.it/id/eprint/3146This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31462016-02-26T15:48:35ZA green policy to schedule tasks in a distributed cloudStefano Sebastiostefano.sebastio@imtlucca.itGiorgio Gneccogiorgio.gnecco@imtlucca.it2016-02-26T15:47:40Z2016-03-01T10:41:10Zhttp://eprints.imtlucca.it/id/eprint/3145This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31452016-02-26T15:47:40ZBinary and
multi-class Parkinsonian disorders classification using Support Vector Machines with
graph-based featuresRita Morisirita.morisi@imtlucca.itGiorgio Gneccogiorgio.gnecco@imtlucca.itNico LanconelliStefano ZanigniDavid Neil MannersClaudia TestaStefania EvangelistiLaura Ludovica GramegnaClaudio BianchiniPietro CortelliCaterina TononRaffaele Lodi2016-02-26T15:44:37Z2016-03-04T08:24:34Zhttp://eprints.imtlucca.it/id/eprint/3144This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31442016-02-26T15:44:37ZOptimal distributed task scheduling in volunteer cloudsStefano Sebastiostefano.sebastio@imtlucca.itGiorgio Gneccogiorgio.gnecco@imtlucca.itAlberto Bemporadalberto.bemporad@imtlucca.it2016-02-26T15:43:29Z2016-03-04T08:25:32Zhttp://eprints.imtlucca.it/id/eprint/3143This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31432016-02-26T15:43:29ZTransboundary pollution control and environmental absorption efficiency managementF. El OuardighiK. KoganGiorgio Gneccogiorgio.gnecco@imtlucca.itMarcello Sanguineti
Various extensions are investigated, including the infinite learning horizon and, via the so-called "kernel trick", the case of nonlinear models.
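The Kalman-filter estimate that the abstract above uses as a point of comparison can be sketched for the static-parameter case. Everything below (dimensions, noise level, regressor distribution) is an illustrative assumption, not material from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: learn a constant parameter vector theta from
# streaming supervised examples y_t = phi_t . theta + noise via the
# Kalman-filter recursion for a static state (no process noise).
theta_true = np.array([1.0, -0.5])
R = 0.1                      # measurement-noise variance (assumed known)

theta_hat = np.zeros(2)      # prior mean of theta
P = 10.0 * np.eye(2)         # prior covariance of theta

for _ in range(500):
    phi = rng.normal(size=2)                         # random regressor
    y = phi @ theta_true + rng.normal(scale=np.sqrt(R))
    S = phi @ P @ phi + R                            # innovation variance
    K = P @ phi / S                                  # Kalman gain
    theta_hat = theta_hat + K * (y - phi @ theta_hat)
    P = P - np.outer(K, phi @ P)                     # covariance update

print(np.round(theta_hat, 2))   # converges towards theta_true
```

The paper's point is that its optimal-control formulation adds a regularization term on top of this baseline, trading some tracking speed for smoothness and robustness to outliers.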
arXiv:1606.04272 [math.OC]
Giorgio Gneccogiorgio.gnecco@imtlucca.itAlberto Bemporadalberto.bemporad@imtlucca.itMarco GoriMarcello Sanguineti2016-02-26T15:40:04Z2016-03-04T08:24:59Zhttp://eprints.imtlucca.it/id/eprint/3141This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31412016-02-26T15:40:04ZSymmetric and antisymmetric properties of solutions to kernel-based machine learning problemsGiorgio Gneccogiorgio.gnecco@imtlucca.it2016-02-26T15:38:43Z2016-03-01T10:42:48Zhttp://eprints.imtlucca.it/id/eprint/3140This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31402016-02-26T15:38:43ZWelfare effects of uniform and differential pricing schemes: an analysis through quadratic programmingGiorgio Gneccogiorgio.gnecco@imtlucca.itFabio Pammollif.pammolli@imtlucca.itBerna Tuncayberna.tuncay@imtlucca.it2016-02-26T15:34:36Z2016-03-04T08:25:49Zhttp://eprints.imtlucca.it/id/eprint/3139This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31392016-02-26T15:34:36ZA game-theoretic approach to parallel trade through the
price of anarchyGiorgio Gneccogiorgio.gnecco@imtlucca.itBerna Tuncayberna.tuncay@imtlucca.it2016-02-26T15:31:00Z2016-03-01T10:43:13Zhttp://eprints.imtlucca.it/id/eprint/3138This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31382016-02-26T15:31:00ZA proposal for within-cohort redistribution in a funded pension systemBenedetta FrassiGiorgio Gneccogiorgio.gnecco@imtlucca.itFabio Pammollif.pammolli@imtlucca.itXue Wenxue.wen@imtlucca.it2016-02-26T15:28:19Z2016-03-04T08:23:15Zhttp://eprints.imtlucca.it/id/eprint/3137This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31372016-02-26T15:28:19ZCongestion-aware forwarding strategies for intermittently connected networksMarco CelloGiorgio Gneccogiorgio.gnecco@imtlucca.itMario MarcheseMarcello Sanguineti2016-02-26T14:35:46Z2016-02-29T08:31:00Zhttp://eprints.imtlucca.it/id/eprint/3129This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31292016-02-26T14:35:46ZBinary and multi-class classification of parkinsonian disorders with support vector machines based on quantitative brain MR and graph-based featuresLaura Ludovica GramegnaClaudia TestaRita Morisirita.morisi@imtlucca.itStefano ZanigniGiorgio Gneccogiorgio.gnecco@imtlucca.itNico LanconelliDavid Neil MannersStefania EvangelistiPietro CortelliCaterina TononRaffaele Lodi2016-02-26T13:24:09Z2016-02-26T13:24:09Zhttp://eprints.imtlucca.it/id/eprint/3127This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31272016-02-26T13:24:09ZLearning with hard constraints as a limit case of learning with soft constraintsGiorgio Gneccogiorgio.gnecco@imtlucca.itMarco GoriStefano MelacciMarcello Sanguineti2016-02-26T12:39:16Z2016-02-26T12:39:16Zhttp://eprints.imtlucca.it/id/eprint/3123This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31232016-02-26T12:39:16ZOn the Curse of Dimensionality in the Ritz MethodIt is shown that the classical Ritz method 
of the calculus of variations suffers from the “curse of dimensionality,” i.e., an exponential growth, as a function of the number of variables, of the dimension a linear subspace needs in order to achieve a desired relative improvement in the accuracy of approximation of the optimal solution value. The proof is constructive and is obtained by exhibiting a family of infinite-dimensional optimization problems for which this happens, namely those with quadratic functional and spherical constraint. The results provide a theoretical motivation for the search for alternative solution methods, such as the so-called “extended Ritz method,” to deal with the curse of dimensionality.Giorgio Gneccogiorgio.gnecco@imtlucca.it2016-02-26T12:26:43Z2017-03-21T10:32:36Zhttp://eprints.imtlucca.it/id/eprint/3122This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31222016-02-26T12:26:43ZOptimal design of auxetic hexachiral metamaterials with local resonatorsA parametric beam lattice model is formulated to analyse the propagation properties of elastic in-plane waves in an auxetic material based on a hexachiral topology of the periodic cell, equipped with inertial local resonators. The Floquet-Bloch boundary conditions are imposed on a reduced-order linear model retaining only the dynamically active degrees of freedom. Since the resonators can be designed to open and shift band gaps, an optimal design, focused on the largest possible gap in the low-frequency range, is achieved by solving a maximization problem in the bounded space of the significant geometrical and mechanical parameters.
A local optimized solution, for the lowest pair of consecutive dispersion curves, is found by employing the globally convergent version of the Method of Moving Asymptotes, combined with Monte Carlo and quasi-Monte Carlo multi-start techniques.Andrea Bacigalupoandrea.bacigalupo@imtlucca.itMarco LepidiGiorgio Gneccogiorgio.gnecco@imtlucca.itLuigi Gambarotta2016-02-26T12:10:01Z2016-02-26T12:10:01Zhttp://eprints.imtlucca.it/id/eprint/3119This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31192016-02-26T12:10:01ZAutomatic Classification of Leading Interactions in a String QuartetThe aim of the present work is to automatically analyze the leading interactions between the musicians of a string quartet, using machine learning techniques applied to nonverbal features of the musicians' behavior, which are detected with the help of a motion capture system. We represent these interactions by a graph of influence of the musicians, which displays the relations “is following” and “is not following” with weighted directed arcs. The goal of the machine learning problem investigated is to assign weights to these arcs in an optimal way. Since only a subset of the available training examples is labeled, a semi-supervised support vector machine is used, which is based on a linear kernel to limit its model complexity. Specific potential applications within the field of human-computer interaction are also discussed, such as e-learning, networked music performance, and social active listening.Floriane DardardGiorgio Gneccogiorgio.gnecco@imtlucca.itDonald Glowinski2016-02-24T12:03:56Z2016-12-19T10:00:37Zhttp://eprints.imtlucca.it/id/eprint/3113This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/31132016-02-24T12:03:56ZAsymptotics for randomly reinforced urns with random barriersAn urn contains black and red balls. Let Zn be the proportion of black balls at time n and let 0≤L<U≤1 be random barriers. At each time n, a ball bn is drawn.
If bn is black and Zn-1<U, then bn is replaced together with a random number Bn of black balls. If bn is red and Zn-1>L, then bn is replaced together with a random number Rn of red balls. Otherwise, no additional balls are added, and bn alone is replaced. In this paper we assume that Rn=Bn. Then, under mild conditions, it is shown that Zn → Z a.s. for some random variable Z, and Dn≔√n(Zn-Z)→Patrizia BertiIrene Crimaldiirene.crimaldi@imtlucca.itLuca PratelliPietro Rigo2016-02-12T11:34:27Z2016-02-12T11:34:27Zhttp://eprints.imtlucca.it/id/eprint/3059This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30592016-02-12T11:34:27ZRadiotherapy dose enhancement using BNCT in conventional LINACs high-energy treatment: Simulation and experimentAim: To employ the thermal neutron background that affects the patient during a traditional high-energy radiotherapy treatment for BNCT (Boron Neutron Capture Therapy) in order to enhance radiotherapy effectiveness. Background: Conventional high-energy (15–25 MV) linear accelerators (LINACs) for radiotherapy produce fast secondary neutrons in the gantry with a mean energy of about 1 MeV due to the (γ, n) reaction. This neutron flux, isotropically distributed, is considered an unavoidable undesired dose during the treatment. Considering the moderating effect of the human body, a thermal neutron fluence is localized in the tumour area: this neutron background could be employed for BNCT by previously administering 10B-Phenyl-Alanine (10BPA) to the patient. Materials and methods: Monte Carlo simulations (MCNP4B-GN code) were performed to estimate the total amount of neutrons outside and inside the human body during a traditional X-ray radiotherapy treatment. Moreover, a simplified tissue-equivalent anthropomorphic phantom was used together with bubble detectors for thermal and fast neutrons to evaluate the moderation effect of the human body.
Results: Simulation and experimental results confirm a thermal neutron background during radiotherapy of 1.55 × 10^7 cm−2 Gy−1. The BNCT equivalent dose delivered at 4 cm depth in phantom is 1.5 mGy-eq/Gy, that is, about 3 Gy-eq (4% of the X-ray dose) for a 70 Gy IMRT treatment. Conclusions: The thermal neutron component during a traditional high-energy radiotherapy treatment could produce a localized BNCT effect, with a localized therapeutic dose enhancement corresponding to 4% or more of the photon dose, depending on tumour characteristics. This additional BNCT dose could thus improve radiotherapy, acting as a localized radio-sensitizer.Katia AlikaniotisOscar BorlaValeria MontiGianna Vivaldogianna.vivaldo@imtlucca.itAlba ZaniniGianrossano Giannini2016-02-12T11:24:59Z2016-02-12T11:24:59Zhttp://eprints.imtlucca.it/id/eprint/3057This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30572016-02-12T11:24:59ZA novel algorithm for the calculation of physical and biological irradiation quantities in scanned ion beam therapy: the beamlet superposition approachThe calculation algorithm of a modern treatment planning system for ion-beam radiotherapy should ideally be able to deal with different ion species (e.g. protons and carbon ions), to provide relative biological effectiveness (RBE) evaluations and to describe different beam lines. In this work we propose a new approach for the computation of ion irradiation outcomes, the beamlet superposition (BS) model, which satisfies these requirements. This model applies and extends the concepts of previous fluence-weighted pencil-beam algorithms to quantities of radiobiological interest other than dose, i.e. RBE- and LET-related quantities. It describes an ion beam through a beam-line-specific, weighted superposition of universal beamlets.
The universal physical and radiobiological irradiation effect of the beamlets on a representative set of water-like tissues is evaluated once, coupling the per-track information derived from FLUKA Monte Carlo simulations with the radiobiological effectiveness provided by the microdosimetric kinetic model and the local effect model. Thanks to an extension of the superposition concept, the beamlet irradiation action superposition is applicable for the evaluation of dose, RBE and LET distributions. The weight function for the beamlet superposition is derived from the beam phase space density at the patient entrance. A general beam model commissioning procedure is proposed, which has successfully been tested on the CNAO beam line. The BS model provides the evaluation of different irradiation quantities for different ions, the adaptability permitted by weight functions and the evaluation speed of analytical approaches. Benchmarking plans in simple geometries and clinical plans are shown to demonstrate the model capabilities.G. RussoA. AttiliG. BattistoniD. BertrandF. BourhalebF. CappucciM. CioccaA. MairaniF. M. MilianS. MolinelliM. C. MoroneS. MuraroT. OrtsV. PateraP. SalaE. SchmittGianna Vivaldogianna.vivaldo@imtlucca.itF. Marchetto2016-01-28T11:37:59Z2017-08-04T10:18:57Zhttp://eprints.imtlucca.it/id/eprint/3032This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30322016-01-28T11:37:59ZSleep reverts changes in human gray and white matter caused by wake-dependent trainingLearning leads to rapid microstructural changes in gray (GM) and white (WM) matter. Do these changes continue to accumulate if task training continues, and can they be reverted by sleep? We addressed these questions by combining structural and diffusion-weighted MRI and high-density EEG in 16 subjects studied during the physiological sleep/wake cycle, after 12 h and 24 h of intense practice in two different tasks, and after post-training sleep.
Compared to baseline wake, 12 h of training led to a decline in cortical mean diffusivity. The decrease became even more significant after 24 h of task practice combined with sleep deprivation. Prolonged practice also resulted in decreased ventricular volume and increased GM and WM subcortical volumes. All changes reverted after recovery sleep. Moreover, these structural alterations predicted cognitive performance at the individual level, suggesting that sleep's ability to counteract performance deficits is linked to its effects on the brain microstructure. The cellular mechanisms that account for the structural effects of sleep are unknown, but they may be linked to its role in promoting the production of cerebrospinal fluid and the decrease in synapse size and strength, as well as to its recently discovered ability to enhance the extracellular space and the clearance of brain metabolites.Giulio BernardiLuca Cecchettiluca.cecchetti@imtlucca.itFrancesca SiclariAndreas BuchmannXiaoqian YuGiacomo HandjarasMichele BellesiEmiliano Ricciardiemiliano.ricciardi@imtlucca.itSteven R. KecskemetiBrady A. RiednerAndrew L. AlexanderRuth M. BencaMaria Felice GhilardiPietro Pietrinipietro.pietrini@imtlucca.itChiara CirelliGiulio Tononi2016-01-25T09:41:37Z2016-03-22T16:00:42Zhttp://eprints.imtlucca.it/id/eprint/3031This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30312016-01-25T09:41:37ZFluctuation Theorems for Synchronization of Interacting Polya's urnsWe consider a model of N two-color urns in which the reinforcement of each urn depends also on the content of all the other urns. This interaction is of mean-field type and it is tuned by a parameter α in [0,1]; in particular, for α = 0 the N urns behave as N independent Polya's urns. For α > 0 the urns synchronize, in the sense that the fraction of balls of a given color converges a.s. to the same (random) limit in all urns.
In this paper we study fluctuations around this synchronized regime. The scaling of these fluctuations depends on the parameter α. In particular, the standard scaling t^(−1/2) appears only for α > 1/2. For α ≥ 1/2 we also determine the limit distribution of the rescaled fluctuations. We use the notion of stable convergence, which is stronger than convergence in distribution.Irene Crimaldiirene.crimaldi@imtlucca.itPaolo Dai Pradaipra@math.unipd.itIda G. Minelliida.minelli@dm.univaq.it2016-01-20T10:27:26Z2016-04-06T10:06:49Zhttp://eprints.imtlucca.it/id/eprint/3025This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30252016-01-20T10:27:26ZSupervised Learning of Functional Maps for Infarct ClassificationOur submission to the STACOM Challenge at MICCAI 2015 is based on the supervised learning of a functional map representation between the End Systole (ES) and End Diastole (ED) phases of the Left Ventricle (LV), for classifying infarcted LVs from healthy ones. The Laplace-Beltrami eigen-spectra of the LV surfaces at ES and ED, represented by their triangular meshes, are used to compute the functional maps. Multi-scale distortions induced by the mapping are further calculated by singular value decomposition of the functional map. During training, the information of whether an LV surface is healthy or diseased is known, and this information is used to train an SVM classifier on the singular values at multiple scales corresponding to the distorted areas, augmented with the surface area difference of the epicardium and endocardium meshes. At testing, similar augmented features are calculated and fed to the SVM model for classification. Promising results are obtained both in cross-validation on the training data and on the testing data, which encourages us to believe that this algorithm will perform favourably in comparison to state-of-the-art methods.Anirban MukhopadhyayIlkay Oksuzilkay.oksuz@imtlucca.itSotirios A.
Tsaftarissotirios.tsaftaris@imtlucca.it2016-01-20T08:51:59Z2016-01-20T09:37:06Zhttp://eprints.imtlucca.it/id/eprint/3020This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/30202016-01-20T08:51:59ZThe spreading of misinformation onlineThe wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories that often elicit rapid, large, but naive social responses, such as the recent case of Jade Helm 15, in which a simple military exercise came to be perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., “echo chambers.” Indeed, homogeneity appears to be the primary driver for the diffusion of content, and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and we show that homogeneity and polarization are the main determinants for predicting cascades’ size.Michela Del Vicariomichela.delvicario@imtlucca.itAlessandro BessiFabiana Zollofabiana.zollo@imtlucca.itFabio PetroniAntonio ScalaGuido Caldarelliguido.caldarelli@imtlucca.itH.
Eugene StanleyWalter Quattrociocchiwalter.quattrociocchi@imtlucca.it2016-01-13T12:20:09Z2016-09-14T10:21:16Zhttp://eprints.imtlucca.it/id/eprint/2985This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/29852016-01-13T12:20:09ZEreditare il futuro: dilemmi sul patrimonio culturaleThe book reconstructs the rules, practices, and problems of cultural heritage management, focusing on four dilemmas that shape policies in this sector: public vs private (what does valorization really mean?); “retention” vs exportation (how do works of art circulate?); in-house vs outsourcing (who designs exhibitions?); nature vs culture (what is the relationship between environment, landscape, and culture?). These dilemmas are intertwined with the structure of the institutions called upon to resolve them, with the emergence of global interests, and with the need to adopt measures that reach beyond the present: cultural heritage is not only a memory of the past, but also a legacy for the future.Lorenzo Casinilorenzo.casini@imtlucca.it2016-01-13T12:18:04Z2017-11-23T14:26:46Zhttp://eprints.imtlucca.it/id/eprint/2984This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/29842016-01-13T12:18:04ZComparative Regional Integration: Governance and Legal ModelsComparative Regional Integration: Governance and Legal Models is a groundbreaking comparative study on regional or supranational integration through international and regional organizations. It provides the first comprehensive and empirically based analysis of governance systems by drawing on an original sample of 87 regional and international organizations. The authors explain how and why different organizations select specific governance processes and institutional choices, and outline which legal instruments – regulatory, organizational or procedural – are adopted to achieve integration.
They reveal how different objectives influence institutional design and the integration model: for example, a free trade area could insist on supremacy and refrain from adopting instruments for indirect rule, while a political union would rather engage with all available techniques. This ambitious work merges different backgrounds and disciplines to provide researchers and practitioners with a unique toolbox of institutional processes and legal mechanisms, and a classification of different models of regional and international integration.Carlos ClosaLorenzo Casinilorenzo.casini@imtlucca.it2016-01-12T14:42:35Z2016-01-12T14:42:35Zhttp://eprints.imtlucca.it/id/eprint/2981This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/29812016-01-12T14:42:35ZAccess to Medicines and European Market IntegrationIn this paper we document a process of price convergence in the European market for pharmaceuticals and relate it to access to innovative medicines in individual countries. The EU is a peculiar case study, where free circulation of goods exists, but pricing policies are designed and implemented by Member States. Thanks to a unique census database on product sales and launches
for fifteen EU countries, we detect a process of price convergence, both in nominal and in real
terms. At the same time, we find that a faster rate of price convergence and a lower income per capita are
associated with longer delays in launches of new medicines. Moreover, country delays tend to
be higher for innovative and first-in-class chemical compounds. Our results suggest that inefficiencies arise from drug regulation when countries differ widely in income per capita, public finance
sustainability conditions, and regulatory frameworks. Policies of external reference pricing tend to
exacerbate welfare losses. A policy of differential pricing is suggested, in order to take into account
both therapeutic value and willingness to pay.Fabio Pammollif.pammolli@imtlucca.itArmando Rungiarmando.rungi@imtlucca.it2016-01-11T09:11:26Z2016-02-26T15:56:01Zhttp://eprints.imtlucca.it/id/eprint/2978This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/29782016-01-11T09:11:26ZBasis Risk in Static versus Dynamic Longevity Risk HedgingThis paper provides a simple model for basis risk in a longevity framework, by separating common and idiosyncratic risk factors. Basis risk is captured by a single parameter, that measures the co-movement between the portfolio and the reference population. In this framework, the paper sets out the static, swap-based hedge for an annuity, and compares it with the dynamic, delta-based hedge, achieved using longevity bonds. We assume that the longevity intensity is distributed according to a CIR-type process and provide closed-form derivatives prices and hedges, also in the presence of an analogous CIR process for interest rate risk.Clemente De RosaElisa LucianoLuca Regisluca.regis@imtlucca.it2015-11-02T13:34:20Z2016-04-13T08:48:45Zhttp://eprints.imtlucca.it/id/eprint/2790This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27902015-11-02T13:34:20ZTechnology Diffusion on the International Trade NetworkTechnological innovations generate knowledge spillovers—non-innovators benefit through the adoption, imitation, and extension of new technologies. International trade facilitates technology diffusion by providing importing countries access to technical knowledge that they can potentially internalize. Previous studies of the effect of trade on technology diffusion typically only consider the impact of direct (bilateral) trade on indirect measures of technology (e.g., total factor productivity). 
We contend that the analysis of trade's impact on technology diffusion would be more accurately assessed by using direct measures of specific technologies (e.g., intensity levels) and by allowing for the influence of both the direct and indirect effects of trade in the analysis. The latter is accomplished by modeling the international trade system as a weighted network, which quantifies both direct and indirect trade linkages. Combining trade data with data on the adoption of specific technologies, we find that the network effects of trade play a significant role in technology diffusion. In most cases, countries that are better-connected on the trade network have higher technology intensities. Further support for the importance of trade is provided by the finding that for “outdated” technologies, better-connected countries have lower technology intensities because of their adoption of newer, substitute technologies.Gary D. FerrierJavier A. ReyesZhen Zhuzhen.zhu@imtlucca.it2015-10-28T14:52:49Z2016-05-04T09:46:22Zhttp://eprints.imtlucca.it/id/eprint/2787This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27872015-10-28T14:52:49ZA Quadratic Programming Algorithm Based on Nonnegative Least Squares with Applications to Embedded Model Predictive ControlThis paper proposes an active set method based on nonnegative least squares (NNLS) to solve strictly convex quadratic programming (QP) problems, such as those that arise in Model Predictive Control (MPC). The main idea is to rephrase the QP problem as a Least Distance Problem (LDP) that is solved via a NNLS reformulation. 
While the method is rather general for solving strictly convex QPs subject to linear inequality constraints, it is particularly useful for embedded MPC because (i) it is very fast, compared to other existing state-of-the-art QP algorithms, (ii) it is very simple to code, requiring only basic arithmetic operations for computing LDL^T decompositions recursively to solve linear systems of equations, and (iii) contrary to iterative methods, it provides the solution or recognizes infeasibility in a finite number of steps.Alberto Bemporadalberto.bemporad@imtlucca.it2015-10-19T10:16:18Z2016-02-26T11:47:36Zhttp://eprints.imtlucca.it/id/eprint/2778This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27782015-10-19T10:16:18ZSupervised and semi-supervised classifiers for the detection of flood-prone areasSupervised and semi-supervised machine-learning techniques are applied and compared for the recognition of the flood hazard. The learning goal consists in distinguishing between flood-exposed and marginal-risk areas. Kernel-based binary classifiers using six quantitative morphological features, derived from data stored in digital elevation models, are trained to model the relationship between morphology and the flood hazard. According to the experimental outcomes, such classifiers are appropriate tools when one is interested in performing an initial low-cost detection of flood-exposed areas, to be possibly refined in successive steps by more time-consuming and costly investigations by experts. The use of these automatic classification techniques is valuable, e.g., in insurance applications, where one is interested in estimating the flood hazard of areas for which limited labeled information is available. The proposed machine-learning techniques are applied to the basin of the Italian Tanaro River.
The experimental results show that for this case study, semi-supervised methods outperform supervised ones when only a few labeled examples are used (the number of labeled examples being the same in both cases), together with a much larger number of unlabeled ones.Giorgio Gneccogiorgio.gnecco@imtlucca.itRita Morisirita.morisi@imtlucca.itGiorgio RothMarcello SanguinetiAngela Celeste Taramasso2015-09-04T10:26:42Z2016-05-05T13:50:59Zhttp://eprints.imtlucca.it/id/eprint/2745This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27452015-09-04T10:26:42ZAn interactive tool for semi-automated leaf annotationHigh-throughput plant phenotyping is emerging as a necessary step towards meeting agricultural demands of the future. Central to its success is the development of robust computer vision algorithms that analyze images and extract phenotyping information to be associated with genotypes and environmental conditions for identifying traits suitable for further development. Obtaining quantitative data at the leaf level is important for better understanding this interaction. While certain efforts have been made to obtain such information in an automated fashion, further innovations are necessary. In this paper we present an annotation tool that can be used to semi-automatically segment leaves in images of rosette plants. This tool, which is designed to exist in a stand-alone fashion but also in cloud-based environments, can be used to annotate data directly for the study of plant and leaf growth or to provide annotated datasets for learning-based approaches to extracting phenotypes from images. It relies on an interactive graph-based segmentation algorithm to propagate expert-provided priors (in the form of pixels) to the rest of the image, using the random walk formulation to find a good per-leaf segmentation.
To evaluate the tool we use standardized datasets available from the LSC and LCC 2015 challenges, achieving an average leaf segmentation accuracy of almost 97% using scribbles as annotations. The tool and source code are publicly available at http://www.phenotiki.com and as a GitHub repository at https://github.com/phenotiki/LeafAnnotationTool.Massimo Minervinimassimo.minervini@imtlucca.itMario Valerio Giuffridavalerio.giuffrida@imtlucca.itSotirios A. Tsaftarissotirios.tsaftaris@imtlucca.it2015-09-04T10:24:54Z2016-05-05T13:48:56Zhttp://eprints.imtlucca.it/id/eprint/2744This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27442015-09-04T10:24:54ZLearning to Count Leaves in Rosette PlantsCounting the number of leaves in plants is important for plant phenotyping, since it can be used to assess plant growth stages. We propose a learning-based approach for counting leaves in rosette (model) plants. We relate image-based descriptors learned in an unsupervised fashion to leaf counts using a supervised regression model. To take advantage of the circular and coplanar arrangement of leaves and also to introduce scale and rotation invariance, we learn features in a log-polar representation. Image patches extracted in this log-polar domain are provided to K-means, which builds a codebook in an unsupervised manner. Feature codes are obtained by projecting patches on the codebook using the triangle encoding, introducing both sparsity and a specifically designed representation. A global, per-plant image descriptor is obtained by pooling local features in specific regions of the image. Finally, we provide the global descriptors to a support vector regression framework to estimate the number of leaves in a plant. We evaluate our method on datasets of the Leaf Counting Challenge (LCC), containing images of Arabidopsis and tobacco plants. Experimental results show that on average we reduce absolute counting error by 40% w.r.t.
the winner of the 2014 edition of the challenge (a counting-via-segmentation method). When compared to state-of-the-art density-based approaches to counting, on Arabidopsis image data about 75% fewer counting errors are observed. Our findings suggest that it is possible to treat leaf counting as a regression problem, requiring as input only the total leaf count per training image.Mario Valerio Giuffridavalerio.giuffrida@imtlucca.itMassimo Minervinimassimo.minervini@imtlucca.itSotirios A. Tsaftarissotirios.tsaftaris@imtlucca.it2015-07-24T12:44:02Z2016-04-13T08:34:53Zhttp://eprints.imtlucca.it/id/eprint/2730This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27302015-07-24T12:44:02ZApproximate reduction of heterogenous nonlinear models with differential hullsWe present a model reduction technique for a class of nonlinear ordinary differential equation (ODE) models of heterogeneous systems, where heterogeneity is expressed in terms of classes of state variables having the same dynamics structurally, but which are characterized by distinct parameters. To this end, we first build a system of differential inequalities that provides lower and upper bounds for each original state variable, but such that it is homogeneous in its parameters. Then, we use two methods for exact aggregation of ODEs to exploit this homogeneity, yielding a smaller model of size independent of the number of heterogeneous classes.
We apply this technique to two case studies: a multiclass queuing network and a model of epidemic spread.Max Tschaikowskimax.tschaikowski@imtlucca.itMirco Tribastonemirco.tribastone@imtlucca.it2015-06-23T13:55:35Z2017-07-18T09:46:48Zhttp://eprints.imtlucca.it/id/eprint/2711This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/27112015-06-23T13:55:35ZDeveloping sustainable cultural policies in Turkey: an investigation of public opinion on the theatre sceneThe arts scene in Turkey has seen much debate since the unveiling of the governmental reform agenda on the state support model for the arts, which includes the establishment of an arts-council-type institution and the closure of the State Theatres and the State Opera and Ballet. Nevertheless, despite strong public criticism of this reform agenda, there has never been any comprehensive research to reflect the public opinion. Therefore, this study aims to contribute to recent discussions by providing data on public opinion regarding such a fundamental change, with a particular focus on theatre. Towards this end, a survey was conducted in Istanbul. The findings demonstrate that the majority, including both users and non-users of theatre, value the State Theatres and are in favour of sustaining them. There is also a common belief that, in case of the State Theatres’ closure, the private theatres cannot undertake their public mission.Yesim Tonga Uriarteyesim.tonga@imtlucca.it2015-05-12T10:26:46Z2017-01-26T14:22:08Zhttp://eprints.imtlucca.it/id/eprint/2674This item is in the repository with the URL: http://eprints.imtlucca.it/id/eprint/26742015-05-12T10:26:46ZA Hybrid Model Predictive Control Approach to Attitude Control with Minimum-Impulse-Bit ThrustersThis paper studies an important aspect of attitude control for a launcher's upper stage: the minimum impulse bit (MIB), that is, the minimum torque that can be exerted by the thrusters.
We model this effect using principles of hybrid systems theory and we design a hybrid model predictive control scheme for the attitude control of a launcher during its long coasting period, aiming at minimizing the number of thrusters' actuations. We apply the proposed methodology to a nonlinear model of a typical upper stage with multi-payload capability. Pantelis Sopasakispantelis.sopasakis@imtlucca.itDaniele Bernardinidaniele.bernardini@imtlucca.itHans StrauchSamir BennaniAlberto Bemporadalberto.bemporad@imtlucca.it
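The MIB constraint described in this last abstract can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' formulation: the function name, MIB value, and saturation limit are invented for illustration. It shows why an MIB makes the actuation hybrid: the set of realizable torques is discrete, since a continuous command must be rounded to an integer multiple of the MIB (or to zero, when the command is too small to fire at all).

```python
def apply_mib(torque_cmd, mib=0.1, torque_max=1.0):
    """Map a continuous torque command to what MIB-limited thrusters can realize.

    Commands below half the minimum impulse bit are rounded to zero
    (the thrusters cannot fire that briefly); larger commands are rounded
    to the nearest integer multiple of the MIB and saturated at torque_max.
    All numeric values here are hypothetical.
    """
    magnitude = abs(torque_cmd)
    if magnitude < mib / 2:
        return 0.0                            # below the smallest realizable impulse
    pulses = round(magnitude / mib)           # integer number of impulse bits
    realized = min(pulses * mib, torque_max)  # actuator saturation
    return realized if torque_cmd >= 0 else -realized
```

Because the realizable torques form a discrete set rather than a continuum, the plant together with its actuators is naturally modeled as a hybrid system, which is what motivates a hybrid MPC formulation instead of a purely continuous one.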