Karina Mochetti and Ricardo Dahab.

Abstract: In an Identity-Based Encryption (IBE) scheme, the public key is derived from unique information about the user, such as an email address, so that encryption and decryption are possible without public-key certificates. Ideal lattices are a generalization of cyclic lattices in which the lattice corresponds to an ideal in a ring; they allow improvements in parameter sizes and multiplication complexity. In this paper, we present a version of the lattice-based (H)IBE scheme by Agrawal, Boneh, and Boyen (Eurocrypt'10) for ideal lattices. Like the underlying (H)IBE scheme, ours is shown to be weakly selectively secure based on the hardness of the Learning With Errors (LWE) problem, but our new primitive has smaller public keys.

Fernando Granha Jeronimo and Arnaldo Vieira Moura.

Abstract: Entanglement has a dual role in quantum computation and information. It is an important resource in protocols such as quantum teleportation and superdense coding. On the other hand, it can potentially reduce the soundness of quantum multi-prover Merlin-Arthur proof systems. Thus, understanding and controlling entanglement is of primary importance. To achieve this goal, a super-operator capable of breaking entanglement, called a disentangler, has been proposed, together with a variety of quantum de Finetti theorems. In this work, we study some limits of these approaches using notions of computational hardness. We rule out the existence of some disentanglers and de Finetti theorems under plausible hardness assumptions.

Fernando Granha Jeronimo and Arnaldo Vieira Moura.

Abstract: Classically, extending the Merlin-Arthur complexity class to multiple provers does not increase its computational power, since multiple Merlins can be simulated by a single one. However, in the quantum model an analogous result may no longer hold if the provers are assumed to be unentangled. Surprisingly, NP-complete problems admit quantum multi-prover Merlin-Arthur protocols in which only two unentangled witnesses of logarithmic size are used. In this work, we extend one such protocol so that it can simulate generic classical Probabilistically Checkable Proof (PCP) verifiers. By combining this new protocol with specific PCP theorems, it is possible to recover known results in a simplified way. The first result is a two-prover QMA protocol for NP with logarithmic-size witnesses and a non-trivial completeness-soundness gap. The second is a known characterization of NEXP.

Adilson Luiz Bonifacio and Arnaldo Vieira Moura.

Abstract: One desirable property of test suites is completeness. Roughly, completeness guarantees that non-equivalent implementations under test will always be identified. Several approaches have proposed sufficient, and sometimes also necessary, conditions on the specification model and on the test suite that guarantee completeness. Usually, these approaches impose several restrictions on the specification and on the implementations, such as requiring them to be reduced or complete. Further, test cases are required to be non-blocking on both the specification and the implementation. In this work we deal with the more general scenario where test cases can be blocking. We propose a new notion of equivalence, define a new notion that captures completeness, and then characterize test suite completeness in this new scenario. A related issue concerning test suite completeness is the size of implementations under test. Earlier works usually constrain implementations to have at most the same number of states as the given specification. We establish an upper bound on the number of states of the implementations beyond which the test suite will not be complete, in the classical sense.
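Completeness hinges on telling non-equivalent implementations apart. As a loose illustration only (not the paper's setting, which involves richer models and blocking test cases), a breadth-first search over the product of two complete DFAs finds a shortest distinguishing word, if one exists; all names and the DFA encoding here are ours:

```python
from collections import deque

def find_distinguishing_word(dfa_a, dfa_b, alphabet):
    """BFS over the product machine of two complete DFAs.
    Returns a word accepted by exactly one of them, or None if equivalent.
    A DFA is a triple (initial_state, transitions, accepting_states),
    where transitions maps (state, symbol) -> state."""
    (ia, ta, fa), (ib, tb, fb) = dfa_a, dfa_b
    seen = {(ia, ib)}
    queue = deque([(ia, ib, "")])
    while queue:
        a, b, w = queue.popleft()
        if (a in fa) != (b in fb):
            return w  # the two machines disagree on this word
        for s in alphabet:
            na, nb = ta[(a, s)], tb[(b, s)]
            if (na, nb) not in seen:
                seen.add((na, nb))
                queue.append((na, nb, w + s))
    return None
```

A test suite containing such a word for every non-equivalent implementation in the fault model is, in this toy sense, complete.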

Rachid Rebiha, Arnaldo V. Moura, and Nadir Matringe.

Abstract: The termination problem for affine programs over the integers was left open in [1]. For more than a decade, it has been considered and cited as a challenging open problem. To the best of our knowledge, we present here the most complete response to this issue: we show that termination for affine programs over the integers is decidable under an assumption that holds for almost all affine programs, excluding only an extremely small class of zero Lebesgue measure. We use the notion of asymptotically non-terminating initial variable values for linear loop programs. Those values are directly associated with initial variable values for which the corresponding program does not terminate. We reduce the termination problem of linear affine programs over the integers to the emptiness check of a specific set of initial variable values. For this class of linear or affine programs, we prove that the corresponding set is a semi-linear space and we provide powerful computational methods allowing the automatic generation of these sets. Moreover, we are able to address the conditional termination problem as well: by taking set complements, we obtain a precise under-approximation of the set of inputs for which the program does terminate.

Ariadne Carvalho - Divino César - Edelson Constantino - Elisa Rodrigues - Ewerton Martins - Ewerton Silva - Fabio A. Faria - Flavio Nicastro - Greice Mariano - Gustavo Alkmim - Heiko Hornung - Juliana Borin - Leonardo Tampelini - Maxiwell Garcia - Priscila T. M. Saito - Roberto Pereira - Samuel Buchdid - Vanessa Maike - Walisson Pereira.

Abstract: This technical report contains the abstracts of 33 works presented at the IX Workshop de Teses, Dissertações e Trabalhos de Iniciação Científica of the Instituto de Computação, Universidade Estadual de Campinas (WTD-IC-UNICAMP 2014). The workshop took place on August 11 and 12, 2014, and drew more than 100 participants, including attendees, presenters, and organizers. There were 31 oral presentations and 12 posters. Students were given the choice of presentation format (oral, poster, or both). The publication of these abstracts as a technical report aims to disseminate work in progress and to record, succinctly, the state of the art of the research at IC in the year 2014.

Cleber Mira and João Meidanis.

Abstract: Measuring the minimum number of rearrangement events that transforms one genome into another is an important tool for the comparative analysis of genomes. We propose a new algorithm for finding a minimum sequence of fissions, fusions, and signed reversals that transforms one genome into another. The algorithm is based on representing a chromosome as a cycle (a circular sequence) instead of a mapping; thus a genome is a product of cycles instead of the usual representation as a set of mappings. By representing a chromosome as a cycle, rearrangement events such as fissions, fusions, and signed reversals can be performed in constant time. Besides the change in representation, the algorithm uses a hash table to deal with the problem of using gene names as indexes. The total time complexity depends on the number of genes and on the time spent to retrieve the data for a gene from the hash table (ideally a constant). We also present a formula for the minimum number of fissions, fusions, and signed reversals, which can be evaluated efficiently.
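The constant-time claim can be illustrated with a toy encoding (ours, not necessarily the authors'): a genome as a successor map whose orbits are the chromosome cycles. Swapping the successors of two genes either fuses two cycles into one or splits one cycle into two, with O(1) hash-table work:

```python
def apply_exchange(succ, a, b):
    """Compose the genome (a successor map encoding a product of cycles)
    with the exchange of the successors of genes a and b. This either
    fuses two cycles or splits one, in O(1) dictionary operations.
    The encoding is an illustration, not the paper's exact representation."""
    succ[a], succ[b] = succ[b], succ[a]

def cycle_of(succ, start):
    """Collect the cycle (chromosome) containing gene `start`."""
    cyc, g = [start], succ[start]
    while g != start:
        cyc.append(g)
        g = succ[g]
    return cyc
```

Applying the same exchange twice undoes it, so a fusion and the corresponding fission are inverse O(1) operations.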

Cleber Mira, Paulo Viadanna, Maria Angélica Souza, Arnaldo Moura, João Meidanis, Gabriel Lima, and Renato P. Bossolan.

Abstract: The problem of choosing, from a set of projects, which ones should be executed and when they should start, subject to several restrictions involving project costs, risks, limited resources, and dependencies among projects, while aiming at different, even conflicting, goals, is known as the project portfolio selection (PPS) problem. We study a particular version of the PPS problem stemming from the operation of a real power generation company. It includes distinct categories of resources, intricate dependencies between projects, which are especially important for the management of power plants, and the prevention of risks. We present an algorithm based on the GRASP meta-heuristic that finds better results than a manual solution produced by specialists. The algorithm yielded solutions that decreased the risk, as measured by the company’s standard methodology.
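As a hedged illustration of the meta-heuristic (a generic GRASP skeleton on a toy knapsack-style selection, not the authors' algorithm, constraints, or data), each iteration builds a solution greedily from a restricted candidate list (RCL) and then improves it with a local search:

```python
import random

def grasp_portfolio(values, costs, budget, alpha=0.3, iters=50, seed=0):
    """Generic GRASP sketch: maximize total value under a budget.
    All parameter names are ours, for illustration only."""
    rng = random.Random(seed)
    n = len(values)
    best, best_val = set(), 0.0

    def construct():
        # Greedy randomized construction: pick from an RCL of good candidates.
        sel, spent = set(), 0.0
        cand = [i for i in range(n) if costs[i] <= budget]
        while cand:
            scores = [values[i] / costs[i] for i in cand]
            lo, hi = min(scores), max(scores)
            rcl = [i for i, s in zip(cand, scores) if s >= hi - alpha * (hi - lo)]
            pick = rng.choice(rcl)
            sel.add(pick)
            spent += costs[pick]
            cand = [i for i in cand if i not in sel and spent + costs[i] <= budget]
        return sel

    def local_search(sel):
        # First-improvement swaps until no swap raises the total value.
        improved = True
        while improved:
            improved = False
            spent = sum(costs[i] for i in sel)
            for out in list(sel):
                if improved:
                    break
                for inn in range(n):
                    if inn in sel:
                        continue
                    if spent - costs[out] + costs[inn] <= budget and values[inn] > values[out]:
                        sel.remove(out)
                        sel.add(inn)
                        improved = True
                        break
        return sel

    for _ in range(iters):
        sol = local_search(construct())
        val = sum(values[i] for i in sol)
        if val > best_val:
            best, best_val = sol, val
    return best, best_val
```

The `alpha` parameter trades greediness (alpha = 0) against pure randomness (alpha = 1) in the construction phase; the real PPS problem adds scheduling, dependencies, and risk terms on top of this skeleton.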

Lucas Moutinho Bueno and Jorge Stolfi.

Abstract: We describe an algorithm to triangulate a general map on an arbitrary surface in such a way that the resulting triangulation is vertex-colorable with three colors. (Three-colorable triangulations can be efficiently represented and manipulated by the GEM data structure of Montagner and Stolfi.) The standard solution to this problem is the barycentric subdivision, which produces a number of triangles proportional to the number of edges of the map, including its border edges (those adjacent to only one face). Our algorithm yields a subdivision with a smaller upper bound on the number of triangles, a bound that depends on the Euler characteristic of the surface; the bound improves further when all faces of the map have the same degree. Experimental results show that the resulting triangulations have, on average, significantly fewer triangles than these upper bounds.

Rachid Rebiha, Arnaldo V. Moura, and Nadir Matringe.

Abstract: We present the notion of asymptotically non-terminating initial variable values for linear loop programs. Those values are directly associated with initial variable values for which the corresponding program does not terminate. Our theoretical contributions provide us with powerful computational methods for automatically generating sets of asymptotically non-terminating initial variable values. Such sets are represented symbolically and exactly by a semi-linear space, i.e., characterized by conjunctions and disjunctions of linear equalities and inequalities. Moreover, by taking their complements, we obtain a precise under-approximation of the set of inputs for which the program does terminate, thus reducing the termination problem of linear programs to the emptiness check of a specific set of asymptotically non-terminating initial variable values. Our static input data analysis is not restricted to programs whose variables are interpreted over the reals: we extend our approach and provide new decidability results for the termination problem of affine integer and rational programs.
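A semi-linear space in this sense is a finite union of conjunctions of linear constraints, so membership of a concrete point is straightforward to test. A minimal sketch (the encoding and names are ours, not the paper's):

```python
def in_semilinear(point, disjuncts):
    """A point lies in the semi-linear set if it satisfies every
    constraint of at least one disjunct. Each constraint is a triple
    (coeffs, op, rhs) with op in {'==', '<=', '<'}, meaning
    coeffs . point  op  rhs."""
    def holds(constraint):
        coeffs, op, rhs = constraint
        val = sum(a * x for a, x in zip(coeffs, point))
        return {'==': val == rhs, '<=': val <= rhs, '<': val < rhs}[op]
    return any(all(holds(c) for c in conj) for conj in disjuncts)
```

An emptiness check over such sets is what the reduction in the abstract ultimately requires; it amounts to linear programming over each disjunct.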

Rachid Rebiha, Arnaldo V. Moura, and Nadir Matringe.

Abstract: We present necessary and sufficient conditions for the termination of linear homogeneous programs. We also develop a powerful computational method to decide termination for this class of programs. Our complete characterization of termination for such programs is based on linear algebraic methods. We reduce the verification of the termination problem to checking the orthogonality of a well-determined vector space and a certain vector, both related to loops in the program. Moreover, we provide theoretical results and symbolic computational methods guaranteeing the soundness, completeness, and numerical stability of the approach. Finally, we show that it is enough to interpret variable values over a specific countable number field, or even over its ring of integers, when one wants to check termination over the reals.
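The core reduction is an orthogonality test between a vector and a vector space. Since a vector is orthogonal to a span exactly when it is orthogonal to each spanning vector, the check is a handful of inner products; a generic sketch (not the paper's exact construction):

```python
def is_orthogonal_to_span(v, basis, tol=1e-9):
    """Check whether vector v is orthogonal to the subspace spanned by
    `basis` (a list of vectors): v is orthogonal to span(basis) iff it is
    orthogonal to every spanning vector."""
    return all(abs(sum(bi * vi for bi, vi in zip(b, v))) <= tol
               for b in basis)
```

In the paper's setting the vector space and the vector are extracted symbolically from the loop, so the test can be carried out exactly rather than with a floating-point tolerance.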

Fernando Granha Jeronimo and Arnaldo Vieira Moura.

Abstract: A 2QCFA is an automaton model that combines constant-size quantum and classical memories. This model is more powerful than any 2DFA, since it recognizes some non-regular and even non-context-free languages. In the classical paradigm, it is possible to increase the computational power of a 2DFA by using markers (pebbles) or by augmenting the number of heads. It is therefore natural to investigate their quantum analogues. In this work, we propose a new general marking automaton based on the 2QCFA model. We prove that this model is more powerful than its classical analogues in terms of both computability and complexity. Moreover, we answer affirmatively an open question regarding the quantum-classical computability separation between multi-head 2QCFAs and 2DFAs.

Eduardo Miranda and André Santanchè.

Abstract: A large number of studies in biology, including those involving the reconstruction of phylogenetic trees, result in the production of a huge amount of data, e.g., phenotype descriptions, morphological data matrices, etc. Biologists increasingly face the challenge, and the opportunity, of effectively discovering useful knowledge by crossing and comparing several pieces of information that are not always linked and integrated. Ontologies are one of the promising approaches to address this challenge. However, existing digital phenotypic descriptions are stored in semi-structured formats, making extensive use of natural language. This technical report is related to research developed by us [] that addresses this problem by adding an intermediate step between semi-structured phenotypic descriptions and ontologies. It remodels semi-structured descriptions into a graph abstraction in which the data are linked. Graph transformations then support the transition from the semi-structured data representation to a more formalized representation with ontologies. The present technical report drills down into the implementation details of our system. It provides a module to ingest phylogenetic trees and phenotype descriptions, represented in semi-structured formats, into a graph database. Additionally, we present two approaches to combining distinct data sources and an algorithm to trace changes in phylogenetic traits across trees.

Elisa de Cássia Silva Rodrigues, Jorge Stolfi, and Anamaria Gomide.

Abstract: We describe a method for the interactive editing of smooth deformations of the plane defined by splines of degree 5 on an arbitrary triangular mesh. The continuity constraints and the user editing actions are combined using weighted and constrained least squares instead of the usual finite element approach. We work with exact integer arithmetic in order to detect and eliminate redundancies in the set of constraints. To validate our method, we used it as part of a user-friendly interactive editor for ``2.5D'' space deformations. This editor has been used to deform 3D models of non-rigid living microscopic organisms as seen in actual optical microscope images.
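For intuition on the weighted least-squares step, here is a small self-contained solver of the normal equations (A^T W A) x = A^T W b. This is a generic floating-point sketch with names of our own; the editor described above instead works with constraints and exact integer arithmetic:

```python
def solve_linear(M, v):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(v)
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n + 1):
                A[r][c] -= f * A[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (A[i][n] - sum(A[i][c] * x[c] for c in range(i + 1, n))) / A[i][i]
    return x

def weighted_least_squares(A, b, w):
    """Minimize sum_k w[k] * (A[k] . x - b[k])^2 by solving the
    normal equations (A^T W A) x = A^T W b."""
    m, n = len(A), len(A[0])
    AtWA = [[sum(w[k] * A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
            for i in range(n)]
    AtWb = [sum(w[k] * A[k][i] * b[k] for k in range(m)) for i in range(n)]
    return solve_linear(AtWA, AtWb)
```

Raising the weight of a constraint row pulls the solution toward satisfying that constraint more closely, which is how soft continuity constraints and user edits can be balanced.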

Fábio Dias and Neucimar Jerônimo Leite.

Abstract: In this work, we approach the problem of the three-dimensional reconstruction of the trajectory of an object using unsynchronized cameras. While most methods in the literature try to measure, as accurately as possible, the timing difference between the cameras, we choose a very different path: we ignore most, or all, of the time information in the videos. The idea is quite simple: each pair of camera and projected trajectory generates a surface in the volume, and the intersection of these surfaces is the desired result. Since this intersection can be very complex, we adopt a Monte Carlo approach for the reconstruction, using random three-dimensional points to estimate the region of intersection. Therefore, any camera calibration scheme can be used. These points are later used to compute a continuous curve, which is the final result of the method. We compared this method to a very simple reconstruction approach, which assumes the frames are synchronized, and obtained outstanding results. Our implementation and data are freely available at https://code.google.com/p/ucr-timeless/.
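The Monte Carlo idea can be sketched with implicit surfaces: draw random points in the volume and keep those close to every camera's surface. Here the surfaces are given as distance-like functions, an assumption of ours for illustration (the paper builds them from the cameras and the projected trajectories):

```python
import random

def sample_intersection(dist_a, dist_b, bounds, n=20000, eps=0.05, seed=1):
    """Monte Carlo sketch: sample uniform random 3D points in a cube and
    keep those within `eps` of both surfaces, approximating the region
    where the two surfaces intersect."""
    rng = random.Random(seed)
    lo, hi = bounds
    kept = []
    for _ in range(n):
        p = tuple(rng.uniform(lo, hi) for _ in range(3))
        if abs(dist_a(p)) < eps and abs(dist_b(p)) < eps:
            kept.append(p)
    return kept
```

The accepted point cloud is then a thin shell around the intersection curve; fitting a continuous curve through it gives the final trajectory, as the abstract describes.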

Rachid Rebiha, Arnaldo V. Moura, and Nadir Matringe.

Abstract: We present the new notion of asymptotically non-terminant initial variable values for linear loop programs. Those specific values are directly associated with initial variable values for which the corresponding program does not terminate. Our theoretical contributions provide us with powerful computational methods for automatically generating sets of asymptotically non-terminant initial variable values. Moreover, we reduce the termination problem of linear programs to the emptiness check of a specific set of asymptotically non-terminant initial variable values. Those sets are represented symbolically and exactly by a semi-linear space, i.e., conjunctions and disjunctions of linear equalities and inequalities. Also, by taking the complements of these sets, we obtain a precise under-approximation of the input sets for which linear programs do terminate.

Ambra Melloni, Marco Tagliasacchi, Stefano Tubaro, Filipe de O. Costa, Marina Oikawa, Zanoni Dias, Siome Goldenstein, and Anderson Rocha.

Abstract: This work reports on the effect of denoising filters applied to image phylogeny, concluding the collaboration between Politecnico di Milano and UNICAMP within the REWIND project. The purpose of image phylogeny is to discover dependencies among a group of images representing similar or identical content, in order to construct a tree describing the relationships between the images. We applied a family of image processing techniques to a set of images in order to create parental relationships. Operating in the wavelet domain, it is possible to apply denoising filters to the images, separating each image into two contributions: a component part (the denoised image) and a randomness part (the noise). We evaluate the effectiveness of this method on three different databases, using subsets of the image processing functions. In this report, we show that geometric transformation imperfections strongly influence the results of the denoising algorithm when reconstructing trees for the component and randomness parts.
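The decomposition into a component part and a randomness part can be mimicked in one dimension with a moving average standing in for the wavelet-domain denoiser (an illustrative simplification of ours, not the filters used in the report):

```python
def split_denoise(signal, radius=1):
    """Split a 1-D signal into a smooth 'component' part and a residual
    'randomness' part via a moving average. By construction the two parts
    sum back to the original signal, as with the wavelet decomposition
    described in the abstract."""
    n = len(signal)
    component = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        component.append(sum(signal[lo:hi]) / (hi - lo))
    randomness = [s - c for s, c in zip(signal, component)]
    return component, randomness
```

Phylogeny trees can then be reconstructed separately from the component and randomness parts and compared, which is the experiment the report evaluates.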

Adilson Luiz Bonifacio and Arnaldo Vieira Moura.

Abstract: One of the important tasks in model-based testing is checking the completeness of test suites. In this paper, we first extend some known sufficient conditions for test suite completeness by also allowing partial implementations. We also study a new notion of equivalence, and show that the same conditions are still sufficient when treating only complete implementations, but not when we also allow partial implementation models.

Instituto de Computação ::
Universidade Estadual de Campinas
Av. Albert Einstein, 1251 - Cidade Universitária Zeferino Vaz • 13083-852 Campinas, SP - Brasil • Fone: [19] 3521-5838