Technical Reports Published in 2005

  • IC-05-34
    ChkSim: A distributed checkpointing simulator.
    Gustavo M. D. Vieira and Luiz E. Buzato.
    December 2005. In English, 12 pages.

    Abstract: ChkSim is a portable tool to simulate the execution of checkpointing algorithms in distributed applications. It provides quantitative data to system and algorithm designers, enabling the comparative assessment of these algorithms. This report describes the ChkSim simulation model, its software architecture and user manual.

  • IC-05-33
    Admission control and tuning of medium access parameters in 802.11e networks.
    Juliana Freitag, Nelson L. S. da Fonseca, and José F. de Rezende.
    December 2005. In English, 26 pages.

    Abstract: The IEEE 802.11e standard is intended to support applications with QoS requirements. However, such provisioning cannot be achieved when the network load is high. In this paper, the effectiveness of two mechanisms for measurement-based admission control in 802.11e networks is evaluated. In addition, a new control mechanism which dynamically tunes parameters of the 802.11e contention-based access method is introduced. The proposed mechanism aims at providing QoS as well as ameliorating the problem of delay asymmetry.

  • IC-05-32
    Semidefinite programming based algorithms for the ratio cut problem.
    Luis A. A. Meira and Flávio Miyazawa.
    December 2005. In English, 23 pages.

    Abstract: In this paper we analyze a known relaxation for the Ratio Cut Problem based on positive semidefinite constraints, and present a branch-and-bound algorithm and heuristics based on this relaxation. The relaxation and the algorithms were tested on small and moderate-sized instances. The relaxation leads to values very close to the optimum solution values. The exact algorithm obtained solutions for small and moderate-sized instances, and the best heuristics obtained optimum or near-optimum solutions for all tested instances. We point out interesting characteristics of each of these heuristics. We also compare the semidefinite-based branch-and-bound algorithm with a commercial integer programming solver.
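
    For reference, the usual statement of the ratio-cut objective, which the abstract does not restate; this is the standard textbook formulation, given under the assumption that it matches the report's:

    ```latex
    % Ratio cut: minimize the cut weight normalized by the sizes of the two
    % sides, over all nontrivial vertex subsets S of V.
    \[
      \min_{\emptyset \neq S \subsetneq V} \;
      \frac{w(S, V \setminus S)}{|S| \cdot |V \setminus S|}
    \]
    % Here w(S, V \setminus S) denotes the total weight of the edges with one
    % endpoint in S and the other in V \setminus S.
    ```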

  • IC-05-31
    The linear-time approach for graph-cut object detection.
    A. X. Falcão, P. A. V. Miranda, and A. Rocha.
    December 2005. In English, 11 pages.

    Abstract: Image segmentation using graph cuts has become very popular in recent years. These methods are usually computationally expensive, even with hard constraints (seed pixels). We present a solution that requires only internal seeds and runs in time proportional to the number of pixels. Our method computes an ordered region growing in which the propagation order of each pixel is proportional to the cost of an optimum path from the seed set to that pixel. Each pixel defines a region that includes it and all pixels with lower propagation order. The boundary of each region is a possible cut boundary, whose cut measure is computed and assigned to the corresponding pixel on the fly. The object is obtained by selecting the pixel with minimum cut measure and all pixels within its respective cut boundary. The method works with various cut measures and is evaluated through several experiments.
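
    A minimal sketch of the ordered growing step, assuming Python, a 4-neighborhood, and an intensity-difference arc weight (none of which the abstract fixes); the paper's on-the-fly cut-measure bookkeeping is omitted:

    ```python
    # Hedged sketch: pixels leave the priority queue in nondecreasing order of
    # optimum-path cost from the seed set; the pop index is the propagation
    # order that the abstract describes.
    import heapq

    def propagation_order(image, seeds):
        """image: 2D list of intensities; seeds: iterable of (row, col)."""
        rows, cols = len(image), len(image[0])
        cost = {s: 0 for s in seeds}
        heap = [(0, s) for s in seeds]
        heapq.heapify(heap)
        order = {}
        while heap:
            c, p = heapq.heappop(heap)
            if p in order:
                continue
            order[p] = len(order)  # propagation order of pixel p
            r, k = p
            for n in ((r - 1, k), (r + 1, k), (r, k - 1), (r, k + 1)):
                if 0 <= n[0] < rows and 0 <= n[1] < cols:
                    # f-max path cost: strongest arc on the path, with the arc
                    # weight taken as an intensity difference (an assumption)
                    w = max(c, abs(image[n[0]][n[1]] - image[r][k]))
                    if w < cost.get(n, float('inf')):
                        cost[n] = w
                        heapq.heappush(heap, (w, n))
        return order
    ```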

  • IC-05-30
    Object detection using tree pruning.
    A. X. Falcão, P. A. V. Miranda, and F. P. G. Bergo.
    December 2005. In English, 10 pages.

    Abstract: We propose a new approach for object detection using the image foresting transform (IFT). For a given image with seed pixels inside the object, the IFT computes an optimum-path forest in a graph derived from the image, such that object and background are connected by a few optimum paths (leaking paths). The leaking paths cross the object's boundary through its "most weakly connected" parts (called leaking pixels). The topology of the forest is exploited to identify the leaking pixels and eliminate their subtrees, such that the remaining forest defines the object. The method is called tree pruning. We present tree pruning with an automatic approach to detect leaking pixels, which does not require parameters; show several examples; discuss its limitations; and evaluate the method for automatic detection of license plates using a database with 990 images.
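
    A minimal sketch of the pruning step alone, assuming Python; the optimum-path forest (given here as parent pointers) and the detected leaking pixels, which are the report's actual contributions, are taken as inputs:

    ```python
    # Hedged sketch: remove the subtree rooted at each leaking pixel; the
    # pixels that remain in the forest define the object.
    from collections import defaultdict, deque

    def prune_trees(parent, pixels, leaking):
        """parent: dict mapping each pixel to its predecessor (roots map to None)."""
        children = defaultdict(list)
        for p in pixels:
            if parent.get(p) is not None:
                children[parent[p]].append(p)
        removed, queue = set(leaking), deque(leaking)
        while queue:  # breadth-first walk below each leaking pixel
            for child in children[queue.popleft()]:
                if child not in removed:
                    removed.add(child)
                    queue.append(child)
        return [p for p in pixels if p not in removed]  # remaining forest = object
    ```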

  • IC-05-29
    Analysis of embedded sequences in EST projects.
    C. Baudet and Z. Dias.
    November 2005. In English, 13 pages.

    Abstract: Slippage is an important sequencing problem that can occur in EST projects. However, there are very few studies about it. In this work we propose three new methods to detect slippage artifacts: the Arithmetic Mean Method, the Geometric Mean Method, and the Echo Coverage Method. Each method is simple and has two different strategies for processing sequences: suffix and subsequence. Using the 291,689 EST sequences produced in the SUCEST project, we performed comparative tests between the proposed methods and the Telles and Silva method. The subsequence strategy is better than the suffix strategy because it is not anchored at the end of the sequence, and is therefore more flexible in finding slippage at the beginning of the EST. Compared with the Telles and Silva method, our methods have the advantage of not discarding the majority of the sequences marked as slippage; instead, they remove only the slipped artifact from each sequence. The tests indicate that the Echo Coverage Method with the subsequence strategy offers the best compromise between slippage detection and ease of calibration.

  • IC-05-28
    GRASP strategies for scheduling activities at oil wells with resource displacement.
    Romulo A. Pereira, Arnaldo V. Moura, and Cid C. de Souza.
    October 2005. In English, 27 pages.

    Abstract: Before promising locations at petroliferous basins become productive oil wells, it is often necessary to complete drilling activities at these locations. The scheduling of such activities must satisfy several conflicting constraints and attain a number of goals. Moreover, resource displacements between wells are also important. We describe a Greedy Randomized Adaptive Search Procedure (GRASP) for the scheduling of oil well development activities with resource displacement. The results are compared with schedules produced by a well-accepted constraint programming implementation. Computational experience on real instances indicates that the GRASP implementation is competitive, outperforming the constraint programming implementation.

  • IC-05-27
    A fully resolved consensus between fully resolved phylogenetic trees.
    José Augusto Amgarten Quitzau and João Meidanis.
    October 2005. In English, 16 pages.

    Summary: Currently there is a considerable number of methods for reconstructing phylogenetic trees. Many of them can produce trees of considerable quality, but none is guaranteed to recover the phylogenetic tree that represents the true evolutionary history of living beings. On the other hand, there is a large number of consensus methods, which generally group the most common parts of the trees in a collection into a single tree. Unfortunately, the use of consensus methods for the reconstruction of phylogenetic trees is a kind of taboo among biologists, who normally see consensus only as a comparison tool and not as a tree builder.

    In this work, we challenge this taboo by defining a consensus method that builds fully resolved phylogenetic trees based on the most common parts of the fully resolved trees in a given collection. We also present results of simple tests that show that the trees produced by this method are generally better than the trees in the collection used to create them.

    Abstract: Nowadays, there is a considerable number of phylogeny reconstruction methods, and many are able to produce reasonable trees, but none of the methods is guaranteed to reconstruct the true tree. On the other hand, there is an even larger number of phylogenetic consensus methods, which usually put together the most common parts of a collection of phylogenetic trees. Unfortunately, there is also a taboo concerning the use of consensus methods to reconstruct phylogenetic trees, because most biologists see phylogenetic consensus methods mainly as comparators and not as phylogenetic tree constructors.

    In this work, we challenge this taboo by defining a consensus method which builds a fully resolved phylogenetic tree based on the most common parts of fully resolved trees in a given collection. We also present results of simple tests which show that this consensus is usually better than the trees used to build it.

  • IC-05-26
    Integer programming models for the SONET ring assignment problem.
    Elder M. Macambira, Nelson Maculan, and Cid C. de Souza.
    October 2005. In English, 31 pages.

    Abstract: In this paper we consider the SONET ring assignment problem (SRAP) presented by Goldschmidt et al. (SONET/SDH ring assignment with capacity constraints, Discrete Applied Mathematics, 129:99-128, 2003). The authors pointed out the inadequacy of solving SRAP instances using their integer programming formulation and commercial linear programming solvers. Similar experiences with IP models for SRAP are reported by Aringhieri and Dell'Amico (Comparing metaheuristic algorithms for SONET network design problems, Journal of Heuristics, 11(1):35-57, 2005). In this paper we present some variants of IP formulations for SRAP and test the performance of standard branch-and-bound algorithms implemented in a commercial code when solving those models. The results are significantly better than those reported earlier in the literature. Moreover, we also reformulate the problem as a set partitioning model amended with an additional knapsack constraint. This new formulation has an exponential number of columns and, to solve it, we implemented a branch-and-price/column generation algorithm. Extensive computational experiments with our algorithm showed that it is orders of magnitude faster than its competitors. Hundreds of instances taken from both Aringhieri's and Goldschmidt's papers that were not solved by these authors in hours of computation are solved here to optimality in just a few seconds.

  • IC-05-25
    Study of transaction models for the mobile computing environment.
    Tarcisio da Rocha and Maria Beatriz F. de Toledo.
    September 2005. In Portuguese, 31 pages.

    Summary: Transactions have become a widely used technique to ensure data consistency, reliability, and recoverability in distributed applications. In addition to the models aimed at traditional distributed systems, the literature presents several transaction models for the mobile computing environment. However, most of the proposed models were designed for a specific application domain, which makes them inflexible and unsuitable for other application domains or for applications that require greater flexibility. This report presents a survey of transaction models for mobile computing, highlighting some of their contributions and limitations. It also presents a set of requirements that can be useful in the development of a platform to support transactions in the mobile computing environment.

  • IC-05-24
    Fuzzy data in video image object tracking.
    F. A. S. Dias and N. J. Leite.
    September 2005. In English, 13 pages.

    Abstract: In this paper, we introduce a new method for object tracking which takes into account multiple target features and uses fuzzy logic for knowledge representation and information fusion. The knowledge about the target location is represented by fuzzy matrices, one matrix for each considered feature. Since all of this information belongs to the same domain, the final target location is given by a fusion of these matrices through fuzzy operators, which combines all of the defined target features.
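
    A minimal sketch of the fusion step, assuming Python/NumPy and the minimum as the fuzzy operator (the abstract does not name the operator):

    ```python
    # Hedged sketch: fuse per-feature fuzzy location matrices with a fuzzy AND
    # (elementwise minimum) and take the argmax as the estimated target location.
    import numpy as np

    def fuse_and_locate(feature_matrices):
        """feature_matrices: list of 2D arrays of membership values in [0, 1]."""
        fused = np.minimum.reduce(feature_matrices)  # fuzzy intersection (t-norm)
        return np.unravel_index(np.argmax(fused), fused.shape)  # (row, col)
    ```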

  • IC-05-23
    Requirements for a component-based development environment focused on software architecture.
    Rodrigo Teruo Tomita and Cecília Mary Fischer Rubira.
    September 2005. In Portuguese, 152 pages.

    Summary: The construction of software through the planned integration of reusable components, known as component-based development (CBD), has gained wide acceptance for the development of large and complex information systems. The software architecture approach is complementary to the CBD paradigm, being responsible for integrating the components in order to obtain the desired qualities in the final system. Therefore, the main CBD processes are also centered on the software architecture. Existing development environments generally support UML modeling and the implementation of object-oriented systems and components. However, they do not integrate software architecture modeling with CBD. This technical report describes the requirements for the Bellatrix environment, an extension of Eclipse for CBD centered on the UML Components process and on software architecture. These requirements are described in terms of use cases and user interface prototypes.

  • IC-05-22
    A component-based development process with component reuse.
    Patrick Henrique da Silva Brito, Maria Antonia Martins Barbosa, Paulo Asterio de Castro Guerra, and Cecília Mary Fischer Rubira.
    September 2005. In Portuguese, 25 pages.

    Summary: The current software market is increasingly subject to tight deadlines and costs, together with high quality requirements. Because it streamlines development and makes it less costly in the long run, component-based development (CBD) has been widely adopted. The objective of this work is to propose a high-quality CBD process that maximizes the reuse of ready-made components during development. An important feature of the process is that it is generic, in the sense that it consists of general activities common to most current CBD methodologies. The intent is for it to be easily adaptable to the various processes in use, which facilitates its practical adoption. In addition, the proposed process was adapted to the UML Components process.

  • IC-05-21
    A new framework to combine descriptors for content-based image retrieval.
    Ricardo da S. Torres, Alexandre X. Falcão, Marcos André Gonçalves, Baoping Zhang, Weiguo Fan, and Edward A. Fox.
    September 2005. In English, 15 pages.

    Abstract: Methods that combine image database descriptors have a strong influence on the effectiveness of content-based image retrieval (CBIR) systems. Although there are many combination functions described in the image processing literature, empirical evaluation studies have shown that those functions do not perform consistently well across different contexts (queries, image collections, users). Moreover, it is often very difficult for human beings to identify optimal combination functions for a particular application. In this paper, we propose a novel framework using Genetic Programming to combine image database descriptors for CBIR. Our framework is validated through several experiments involving two image databases and a specific domain, where the images are retrieved based on the shape of their objects.

  • IC-05-20
    A deformable model for fast visualization of Euclidean isosurfaces of the brain.
    F. P. G. Bergo, F. F. de Goes, S. K. Goldenstein, and A. X. Falcão.
    September 2005. In English, 13 pages.

    Abstract: Curvilinear reformatting of MR images has been of great assistance to the diagnosis of dysplastic lesions in epilepsy patients. A recent improvement is its complete automation using Euclidean isosurfaces obtained from an "envelope" of the brain. Each isosurface consists of points with the same Euclidean distance from the envelope. The 3D visualization of these isosurfaces is usually computed using voxel projection with the voxel intensities as texture map. In this paper, we introduce a deformable model which combines a polygonal mesh and a graph, both obtained from the envelope. Our data representation can quickly create meshes for isosurfaces at arbitrary depths from the envelope. This representation fits modern GPUs well, leading to high interactivity with currently available hardware. We evaluate the visualization performance of our new deformable model side by side with an efficient implementation of voxel projection.
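
    For context, a brute-force way to obtain one such isosurface (not the paper's deformable model), assuming Python with SciPy/scikit-image and a precomputed binary envelope:

    ```python
    # Hedged sketch: the depth of each voxel is its Euclidean distance to the
    # envelope surface; the isosurface at depth d is extracted by marching cubes.
    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.measure import marching_cubes

    def isosurface_at_depth(envelope, d):
        """envelope: 3D boolean array, True inside the brain envelope."""
        depth = distance_transform_edt(envelope)  # distance to nearest outside voxel
        verts, faces, _, _ = marching_cubes(depth, level=d)
        return verts, faces  # triangle mesh of the depth-d isosurface
    ```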

  • IC-05-19
    Automatic object detection by tree pruning.
    A. X. Falcão, P. A. V. Miranda, and F. P. G. Bergo.
    September 2005. In English, 24 pages.

    Abstract: The Image Foresting Transform (IFT) has been presented for the design of image processing operators based on connectivity. The IFT reduces image processing problems to a minimum-cost path forest problem in a graph derived from the image. In this paper we propose a new image operator, which solves segmentation by pruning trees of the forest. An IFT is applied to create a minimum-cost path forest whose roots are seed pixels, selected inside a desired object. In this forest, the background consists of a few subtrees rooted at pixels (leaking points) on the object's boundary. The leaking pixels are identified and their subtrees are eliminated, such that the remaining forest defines the object. Tree pruning reduces image segmentation to the choice of a few pixels in the image, favoring solutions for automatic object detection. We present a user-friendly way of identifying leaking pixels and give solutions for their automatic detection. Since automatic seed selection may be different for each application, we evaluate automatic segmentation with tree pruning in three situations: labeling of multiple objects with similar textures, 3D object definition, and shape-based object detection. The results indicate that tree pruning is a promising approach for investigating automatic image segmentation.

  • IC-05-18
    Object definition by kappa-connected components.
    A. X. Falcão, P. A. V. Miranda, A. Rocha, and F. P. G. Bergo.
    September 2005. In English, 20 pages.

    Abstract: The notion of "strength of connectedness" between pixels has been successfully used in image segmentation. We present extensions to these works which can considerably improve the efficiency of object definition tasks. A set of pixels is said to be a kappa-connected component with respect to a seed pixel when the strength of connectedness of any pixel in that set with respect to the seed is higher than or equal to a threshold. We discuss two approaches that define objects based on kappa-connected components with respect to a given seed set: with and without competition among seeds. While previous approaches either take on the competition with a single threshold for all seeds or eliminate the threshold for seed competition, we show that seeds with different thresholds can improve segmentation in both paradigms. We also propose automatic and user-friendly interactive methods for determining the thresholds. The proposed methods are presented in the framework of the image foresting transform, which naturally leads to efficient and correct graph algorithms. The improvements are demonstrated through several segmentation experiments involving medical images.
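
    A minimal sketch of the underlying notion, assuming Python, a 4-neighborhood, and an intensity-based affinity (the report's actual affinities and the seed competition are not reproduced here):

    ```python
    # Hedged sketch: a path's strength is the minimum affinity along it, a
    # pixel's connectedness is the strength of its best path from the seed, and
    # the kappa-connected component keeps pixels with connectedness >= kappa.
    import heapq

    def kappa_component(image, seed, kappa):
        rows, cols = len(image), len(image[0])
        def affinity(p, q):  # assumed affinity: high for similar intensities
            return 1.0 / (1.0 + abs(image[p[0]][p[1]] - image[q[0]][q[1]]))
        strength = {seed: 1.0}
        heap = [(-1.0, seed)]  # max-heap via negated strengths
        comp = set()
        while heap:
            s, p = heapq.heappop(heap)
            s = -s
            if p in comp or s < kappa:
                continue
            comp.add(p)
            r, k = p
            for q in ((r - 1, k), (r + 1, k), (r, k - 1), (r, k + 1)):
                if 0 <= q[0] < rows and 0 <= q[1] < cols:
                    w = min(s, affinity(p, q))  # min affinity along the path
                    if w > strength.get(q, 0.0):
                        strength[q] = w
                        heapq.heappush(heap, (-w, q))
        return comp
    ```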

  • IC-05-17
    Enabling high-level switching activity estimation using SystemC.
    F. Klein, R. Azevedo, and G. Araujo.
    August 2005. In English, 12 pages.

    Abstract: This paper presents an alternative methodology to a traditional high-level power estimation flow that enables the fast gathering of switching activity from SystemC RTL descriptions. The proposed methodology requires minimum effort from the designer, reduces the steps necessary to obtain switching activity information, and requires only a C++ compiler. The resulting activity is very close to the results obtained using a commercial tool, which has a longer flow with several steps. We also show that our approach overcomes some drawbacks of the commercial tool.
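
    As background, switching activity itself is just the toggle count between consecutive sampled signal values; a minimal illustration (Python assumed, not the report's SystemC flow):

    ```python
    # Hedged sketch: XOR consecutive samples and count the differing bits.
    def switching_activity(samples):
        """samples: integer-valued signal samples over time."""
        return sum(bin(a ^ b).count('1') for a, b in zip(samples, samples[1:]))

    print(switching_activity([0b0000, 0b0101, 0b0111]))  # 2 + 1 = 3 toggles
    ```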

  • IC-05-16
    Online class constrained bin packing.
    E. C. Xavier and F. K. Miyazawa.
    July 2005. In English, 9 pages.

    Abstract: In this paper we present approximation results for the online class constrained bin packing problem. In this problem we are given bins of capacity $1$ with $C$ compartments, and $n$ items of $Q$ different classes, each item $i \in \{1, \ldots, n\}$ with class $c(i)$ and size $s(i)$. The problem consists of packing the items into bins in an online way, where each bin contains at most $C$ different classes and has total item size at most $1$. We show that the bounded space version of this problem does not have an algorithm with constant competitive ratio. If each item has size at least $\epsilon < 1$, we show that the problem does not admit an algorithm with competitive ratio better than $O(1/(C\epsilon))$. In the unbounded case we show that the First-Fit algorithm has competitive ratio in $[2.7, 3]$ and we present an algorithm with competitive ratio in $[2.66, 2.75]$.
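
    A minimal sketch of First-Fit under the class constraint, assuming Python (the competitive-ratio analysis is, of course, the report's subject, not this code's):

    ```python
    # Hedged sketch: place each arriving item in the first open bin that has
    # room for its size and whose class set stays within C; otherwise open a bin.
    def first_fit_cc(items, C):
        """items: iterable of (size, cls) pairs arriving online; bins have capacity 1."""
        bins = []  # each bin: [remaining_capacity, set_of_classes, contents]
        for size, cls in items:
            for b in bins:
                if b[0] >= size and (cls in b[1] or len(b[1]) < C):
                    b[0] -= size
                    b[1].add(cls)
                    b[2].append((size, cls))
                    break
            else:  # no open bin accepts the item
                bins.append([1.0 - size, {cls}, [(size, cls)]])
        return bins

    # With C = 1, items of distinct classes cannot share a bin:
    print(len(first_fit_cc([(0.4, 'a'), (0.4, 'b'), (0.5, 'a')], C=1)))  # -> 2
    ```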

  • IC-05-15
    An APTAS for the variable-sized bin packing problem with color constraints.
    E. C. Xavier and F. K. Miyazawa.
    July 2005. In English, 7 pages.

    Abstract: We consider the Variable-Sized Bin Packing Problem with Color Constraints (VBPCC). In this problem we have an unbounded number of bins, each one with size in $\{w_1, \ldots, w_k\}$, and a list of items $L$, each item $e \in L$ with size $s_e$ and color $c_e$. The objective is to pack the items of $L$ into bins such that no bin has items of more than $p$ colors, the total size of the items packed in a bin is no more than the bin size, and the total size of the used bins is minimized. We present an asymptotic polynomial time approximation scheme when the number of different item colors is bounded by a constant $C$.

  • IC-05-14
    How to build a brace.
    Marcelo H. de Carvalho, Cláudio L. Lucchesi, and U. S. R. Murty.
    June 2005. In English, 25 pages.

    Abstract: We prove in this paper that every brace may be obtained, by means of edge additions and suitable vertex-splittings, from the two basic braces $K_2$ and $C_4$. McCuaig showed that this may be achieved, staying within the realm of simple graphs, starting with three infinite families of braces. While our theorem may be deduced from McCuaig's, our proof is simpler.

  • IC-05-13
    The total chromatic number of some bipartite graphs.
    C. N. Campos and C. P. de Mello.
    June 2005. In English, 11 pages.

    Abstract: The total chromatic number $nct(G)$ is the least number of colors needed to color the vertices and edges of a graph $G$ such that no incident or adjacent elements (vertices or edges) receive the same color. This work determines the total chromatic number of grids, particular cases of partial grids, near-ladders, and $k$-dimensional cubes.
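
    The definition above immediately yields the classical lower bound against which such results are read (a standard observation, not a claim taken from the report):

    ```latex
    % A maximum-degree vertex v and its \Delta(G) incident edges are pairwise
    % adjacent or incident, so every total coloring needs at least
    % \Delta(G) + 1 colors:
    \[
      nct(G) \;\ge\; \Delta(G) + 1
    \]
    ```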

  • IC-05-12
    Electronic contracts for business process management systems.
    Marcelo Fantinato, Maria Beatriz Felgar de Toledo, and Itana Maria de Souza Gimenes.
    June 2005. In English, 36 pages.

    Summary: Electronic contracts have played a key role in the life cycle of interorganizational business processes. Several works have sought to define systematic techniques for the elaboration and maintenance of electronic contracts in the context of business process management systems. This technical report presents an overview of the main concepts and of some of the work carried out in this area.

  • IC-05-11
    A model to support formative assessment for distance learning environments.
    Joice Lee Otsuka and Heloísa Vieira da Rocha.
    June 2005. In English, 31 pages.

    Summary: This technical report presents a support model for formative assessment in Distance Learning Environments (or Learning Management Systems). The model is based on the recommendations for a more formative assessment presented in the studies of Hadji [2001]. It seeks to map these recommendations to the distance education setting, considering the formative assessment of collaborative and constructionist learning activities. The model explores computational technologies to provide more effective support for formative assessment in two complementary ways: on the one hand, it is expected to facilitate the planning of the learning activities to be evaluated, as well as the registration of the regulations provided by trainers for participation in these activities; on the other hand, it is expected to reduce the amount of information to be analyzed, assisting the trainer in the retrieval and analysis of quantitative and qualitative information relevant to participation regulation, according to the criteria defined in the assessment planning of each learning activity.

    Abstract: This technical report presents a Formative Assessment Support Model for Learning Management Systems. This model is based on Hadji's [2001] recommendations for a more formative assessment. The proposed model aims to map these recommendations to the distance education domain, considering the formative assessment of collaborative and constructionist learning activities. The model explores computational technologies to provide more effective support to formative assessment in two complementary ways: (1) by supporting the planning of the learning activities to be assessed, as well as the educator's regulation of participation in these planned assessment activities; (2) by reducing the amount of information to be analyzed, helping the educator in the retrieval and analysis of information relevant to participation regulation, according to the criteria defined in the assessment planning of each learning activity.

  • IC-05-10
    Algebraic formalism for genome rearrangements (part 1).
    Cleber Mira and João Meidanis.
    June 2005. In English, 23 pages.

    Summary: One of the most important techniques for analyzing different genomes is comparing the order of their gene blocks. Genome rearrangements are mutational events that move large blocks of genes within the genome. We are interested in finding a minimum sequence of rearrangement events, such as reversals and transpositions, that transforms one genome into another.

    This work is composed of two parts. In the first part, we present the classical contributions to the genome rearrangement problem and discuss some of their characteristics. In the second part, we restate some important problems and results involving rearrangement events and present a new algebraic formalism based on the work of Meidanis and Dias. We prove the equivalence between problems and concepts of the classical theory and those of the new algebraic theory. The purpose of this text is to convince the reader that the algebraic model for genome rearrangement problems is more adequate and complete, since it extends the classical results and allows us to model not only rearrangement events but also recombination events.

    Abstract: One of the most important techniques for the analysis of distinct genome sequences is the comparison of the order of their blocks of genes. Genome rearrangements are mutational events that move large blocks of genes in a genome. We are interested in finding a minimum sequence of rearrangement events, such as reversals and transpositions, that transforms one genome into another.

    This work is composed of two parts. In the first part, we present the classical contributions to the genome rearrangement problem and discuss some of their characteristics. In the second part, we restate some problems and important results involving rearrangement events and present a new algebraic formalism based on the work of Meidanis and Dias. We prove the equivalence between the problems and some concepts of the classical theory and the new algebraic theory. The ultimate objective of this work is to convince the reader that the algebraic model for genome rearrangement problems is more appropriate and complete, in the sense that it extends the classical results and allows us to model not only rearrangement events but also recombination events.
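
    For illustration, the classical reversal event on a signed permutation (the standard textbook operation, not the report's algebraic formalism), in a short Python sketch:

    ```python
    # Hedged sketch: a reversal flips a block of genes and their orientations.
    def reversal(genome, i, j):
        """Reverse genome[i..j] (0-based, inclusive), negating each gene's sign."""
        return genome[:i] + [-g for g in reversed(genome[i:j + 1])] + genome[j + 1:]

    print(reversal([1, -4, 3, 2], 1, 3))  # -> [1, -2, -3, 4]
    ```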

  • IC-05-09
    New EST trimming strategy.
    C. Baudet and Z. Dias.
    May 2005. In English, 12 pages.

    Abstract: Trimming procedures are an important part of the sequence analysis pipeline in an EST sequencing project. In general, trimming is done in several phases, each one detecting and removing some kind of undesirable artifact, such as low quality sequence, vector or adapter sequences, and contamination. However, this strategy often results in a phase being unable to recognize its target because part of it was removed during a previous phase. To remedy this drawback, we propose a new strategy, where each phase detects but does not remove its target, leaving this decision to a post-processing step that occurs after all phases. Our tests show that this strategy can significantly improve the detection of artifacts.
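
    A minimal sketch of the proposed detect-then-trim organization, assuming Python; the detector functions and the half-open interval convention are illustrative assumptions:

    ```python
    # Hedged sketch: every phase only reports artifact intervals on the
    # untouched sequence; one post-processing step merges them and trims once.
    def trim(seq, detectors):
        """detectors: functions mapping seq -> list of (start, end) intervals."""
        hits = sorted(iv for d in detectors for iv in d(seq))
        merged = []
        for s, e in hits:  # merge overlapping artifact intervals
            if merged and s <= merged[-1][1]:
                merged[-1][1] = max(merged[-1][1], e)
            else:
                merged.append([s, e])
        kept, prev = [], 0
        for s, e in merged:  # keep everything between artifacts
            kept.append(seq[prev:s])
            prev = e
        kept.append(seq[prev:])
        return ''.join(kept)
    ```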

  • IC-05-08
    Automatic extraction of Euclidean isosurfaces for visualization of dysplastic lesions from brain MRI.
    F. P. G. Bergo, A. X. Falcão, M. A. Montenegro, and F. Cendes.
    May 2005. In English, 9 pages.

    Abstract: Curvilinear reformatting of MR images has been very useful in epilepsy research for detecting dysplastic lesions in the human brain. However, the method requires careful manual delineation and introduces artifacts. We present a fully automated approach to extract an "envelope" of the brain and obtain surfaces of the same Euclidean distance to the envelope. The visualization of the voxel intensities on these isosurfaces replaces the previous procedure, eliminating its artifacts. The new approach was successfully evaluated on 40 controls and 10 patients with dysplastic lesions.

  • IC-05-07
    Optimal and practical WAB-based consensus algorithms.
    Lasaro Camargos, Fernando Pedone, and Edmundo R. M. Madeira.
    April 2005. In English, 17 pages.

    Abstract: In this paper we introduce two new WAB-based consensus algorithms for the crash-recovery model. The first one, B*-Consensus, is resilient to up to $f$ permanent faults, and can solve consensus in three communication steps. R*-Consensus, our second algorithm, is $f$-resilient, and can solve consensus in two communication steps. These algorithms are optimal with respect to the time complexity versus resilience tradeoff. We provide analytical and experimental evaluation of our algorithms, and compare them with Paxos, an optimal leader-election based consensus algorithm for the crash-recovery model.

  • IC-05-06
    Service-based business process management system architectures.
    Marcelo Fantinato, Maria Beatriz Felgar de Toledo, and Itana Maria de Souza Gimenes.
    April 2005. In English, 39 pages.

    Summary: Business process management systems offer support for processes involving cooperation between organizations that seek to achieve a common business objective. These systems must provide a wide range of functionality for creating, executing, and monitoring business processes. To facilitate the integration of applications distributed among organizations, the computational operations to be performed by the process activities are encapsulated as electronic services, including Web services. This report presents an overview of some business process management system architectures based on electronic services, including some systems based specifically on Web services.

  • IC-05-05
    A web-based experiment on dialogue summarization.
    Norton Trevisan Roman, Paul Piwek, and Ariadne Maria Brito Rizzoni Carvalho.
    March 2005. In English, 16 pages.

    Summary: This report describes the technical details of a web-based experiment carried out at the State University of Campinas, SP, to study the summarization of a set of dialogues. Our intention is to present the strategy followed when building the experiment's website, as well as the statistical and technical issues involved. In this report we do not present the results of the experiment.

    Abstract: This document describes technical details of a web experiment carried out at the State University of Campinas, Brazil, on the summarization of a set of dialogues. Our intention here is to present the strategy we followed to build the experiment's website, as well as the statistical issues and the technicalities involved. In this report, we do not present the experiment's results.

  • IC-05-04
    Formal verification and parameter synthesis for real hybrid systems.
    Adilson Luiz Bonifácio and Arnaldo Vieira Moura.
    March 2005. In English, 42 pages.

    Abstract: A hybrid system is given by a set of real-time reactive components whose dynamic behavior results from the interplay of continuous and discrete events. Hybrid systems have been modeled using formal methods, and results have been obtained using the accompanying computational tools. Some such formal methods and tools are briefly presented, emphasizing their different characteristics. The Hybrid Automata formalism is one among such methods. A hybrid automaton is a finite state automaton where each state is extended to contain a description of a system dynamic profile. Transitions between states model a change in the system dynamics, triggered by discrete events. In this work, hybrid automata are used to model realistic hybrid systems stemming from segments of a subway mesh and parts of an air traffic control system. The models constructed are then verified using the support of computational tools. Moreover, some important model parameters are synthesized using the same tools. The verification certified that the systems operate safely. The synthesis sessions arrived at tighter values for some operational system parameters, while still guaranteeing safe operation.
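
    A minimal rendering of the formalism in Python (a classic thermostat, not the subway or air-traffic models of the report): each location carries a flow and an invariant, and transitions fire on guards.

    ```python
    # Hedged sketch of a hybrid automaton as a data structure.
    from dataclasses import dataclass
    from typing import Callable, Dict, List, Tuple

    @dataclass
    class Location:
        flow: Callable[[float], float]      # dx/dt as a function of x
        invariant: Callable[[float], bool]  # must hold while control stays here

    @dataclass
    class HybridAutomaton:
        locations: Dict[str, Location]
        transitions: List[Tuple[str, Callable[[float], bool], str]]  # (src, guard, dst)

    thermostat = HybridAutomaton(
        locations={
            'on':  Location(flow=lambda x: +1.0, invariant=lambda x: x <= 22.0),
            'off': Location(flow=lambda x: -0.5, invariant=lambda x: x >= 18.0),
        },
        transitions=[
            ('on',  lambda x: x >= 22.0, 'off'),  # too warm: switch heater off
            ('off', lambda x: x <= 18.0, 'on'),   # too cold: switch heater on
        ],
    )
    ```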

  • IC-05-03
    Basic aspects of clustering: concepts and techniques.
    Ricardo Luís Lachi and Heloísa Vieira da Rocha.
    February 2005. In Portuguese, 25 pages.

    Summary: Clustering is a way of organizing data by grouping items into sets, based on some predetermined criteria, such that items in the same set are more similar to each other than to items in other sets. This report presents the general aspects that must be observed when applying a clustering technique to solve a problem. These aspects are: the definition of the representation of the data set to be grouped, the definition of an adequate measure of similarity between the data, and the choice of which clustering technique to use for building the clusters. In addition, a large number of bibliographic references are provided, allowing the reader to study in depth all the topics presented in the text.

    Abstract: Clustering is a kind of data organization that groups data into sets such that, according to some criteria, the elements of a set are more similar to each other than to the elements of other sets. This report presents general aspects that should be observed when applying a clustering technique to solve a problem. The general aspects presented are: the definition of the data representation for the data to be grouped, the definition of the most suitable similarity measure between data, and the definition of which clustering technique to use to build the clusters. Besides that, many bibliographic references are provided to readers in order to make possible deeper studies of all the topics presented in the text.

  • IC-05-02
    Using genetic algorithms for the dynamic job shop scheduling problem with processing time uncertainties.
    Gregório Baggio Tramontina and Jacques Wainer.
    February 2005. In English, 15 pages.

    Abstract: A possible solution to the job shop scheduling problem is the use of genetic algorithms. Although research on the topic is well developed, it concentrates mainly on deterministic static problems. This work reports an experience in using genetic algorithms on the dynamic job shop problem with processing time uncertainties, based on two approaches: guess and solve, and the decomposition of dynamic job shops into static instances. Guess and solve consists of making a controlled guess on the processing times of the jobs and solving the resulting deterministic scheduling problem with a suitable technique, in this report genetic algorithms; the decomposition approach treats the dynamic job shop as a series of static problems. Processing time uncertainties prevent the direct use of the decomposition approach, so an adjustment to this technique is also proposed. Simulation results show that guess and solve with genetic algorithms, together with the proposed adjustment, is an adequate way to handle the dynamic job shop with processing time uncertainties.

  • IC-05-01
    Constraint programming and GRASP approaches to schedule oil well drillings.
    Romulo Albuquerque Pereira, Arnaldo Vieira Moura, and Cid Carvalho de Souza.
    January 2005. In English, 26 pages.

    Abstract: In the process of drilling oil wells, it is necessary to schedule resources, such as oil derricks and boats, in order to perform development activities that prepare promising wells for oil extraction. The scheduling of such activities must satisfy a variety of constraints and must attain some objective criteria, such as maximizing oil production. In this paper, we discuss a greedy randomized adaptive search procedure (GRASP) for the scheduling of oil well drilling activities. We describe our GRASP implementation in detail and compare the results obtained with results derived from a well-accepted constraint programming implementation. Computational experience on real instances of the problem indicates that the GRASP implementation is competitive, outperforming the constraint programming implementation.

