Issue 1, Volume 7, January 2008
Title of the Paper:
Constraint Satisfaction Problem Using Modified Branch and Bound Algorithm
Authors:
Azlinah Mohamed, Marina Yusoff, Itaza Afiani Mohtar, Sofianita Mutalib,
Shuzlina Abdul Rahman
Abstract: A constraint satisfaction problem (CSP) involves assigning values to a set of variables without violating any constraints. Various techniques are available to solve a CSP or to provide a partial solution. This paper presents a modification of the branch and bound algorithm, which is used to solve a constraint satisfaction problem arising in map colouring. Two constraints are involved: only three colours are allowed, and adjacent regions in the map must not be of the same colour. The modified branch and bound algorithm uses backjumping when it encounters a dead end in the search, and static variable ordering is applied to aid the search process. The modified algorithm shows better results in terms of the number of nodes instantiated and a reduced number of backtracks at dead ends. The results illustrate that the modified branch and bound algorithm combined with the variable ordering technique performs better than backjumping alone. It is therefore concluded that the modified branch and bound algorithm improves the solving of constraint satisfaction problems.
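As an illustration of the underlying map-colouring CSP (not of the authors' modified branch and bound algorithm), the following minimal Python sketch colours a hypothetical four-region map with three colours using plain backtracking and a static most-constrained-first variable ordering.

```python
# Minimal sketch of the map-colouring CSP: three colours, adjacent regions
# must differ. Plain backtracking with a static variable ordering; it does
# NOT reproduce the authors' modified branch and bound with backjumping.
COLOURS = ["red", "green", "blue"]

def colour_map(adjacency):
    """adjacency: dict region -> set of neighbouring regions."""
    # Static variable ordering: most-constrained regions first.
    order = sorted(adjacency, key=lambda r: len(adjacency[r]), reverse=True)
    assignment = {}

    def consistent(region, colour):
        return all(assignment.get(nb) != colour for nb in adjacency[region])

    def backtrack(i):
        if i == len(order):
            return True
        region = order[i]
        for colour in COLOURS:
            if consistent(region, colour):
                assignment[region] = colour
                if backtrack(i + 1):
                    return True
                del assignment[region]   # dead end: undo and try next colour
        return False

    return assignment if backtrack(0) else None

# Hypothetical four-region example.
regions = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"}, "D": {"B", "C"}}
print(colour_map(regions))
```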
Keywords: Backjumping, Branch and Bound Algorithm, Constraint Satisfaction
Problem, and Static Variable Ordering
Title of the Paper: An
Application of Type-2 Fuzzy Notions in Website Structures Selection: Utilizing
Extended TOPSIS Method
Authors:
Hamed Qahri Saremi, Gholam Ali Montazer
Abstract: Making e-commerce sites more effective increases customer satisfaction, since visitors can navigate the website more easily and find their targets in less time and at lower cost. As web content grows, the structure of a website becomes more complex and more critical to both web designers and users, which makes prioritizing the various options for a website structure a pivotal decision-making problem involving large uncertainty in judgment. For years the conventional MADM technique TOPSIS was the usual remedy for such problems, solving them with more or less adequate accuracy. As a step toward improving this method, the Fuzzy TOPSIS method, a combination of ordinary TOPSIS and fuzzy set theory, addressed some of the shortcomings of ordinary TOPSIS under uncertainty. However, there are still many occasions in which decision making faces so much vagueness that the Fuzzy TOPSIS method is not sufficiently expressive. As a response to this drawback, in this paper we utilize a new extension of the TOPSIS and Fuzzy TOPSIS methods, based on type-2 fuzzy notions, which is able to cope with a type-2 fuzzy environment and with data incorporating much more fuzziness in decision making. We apply this method to a vague real-world case and discuss its results against the previously developed TOPSIS methods.
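For readers unfamiliar with the baseline method, the following minimal Python sketch implements classical crisp TOPSIS on a hypothetical decision matrix; the type-2 fuzzy extension proposed in the paper is not reproduced here.

```python
import numpy as np

# Classical (crisp) TOPSIS sketch. Rows of the decision matrix are
# alternatives (e.g. candidate website structures), columns are criteria;
# the numbers and weights below are hypothetical.
def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)
    # Ideal and anti-ideal points (max is best for benefit criteria,
    # min is best for cost criteria).
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)   # closeness coefficient per alternative

scores = topsis(matrix=[[7, 9, 4], [8, 6, 5], [6, 8, 7]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
print(scores, "-> best structure:", int(scores.argmax()))
```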
Keywords:
Decision Making; Website Structure; TOPSIS; Type-2 Fuzzy Sets; Interval Valued
Fuzzy Sets; MADM; Fuzzy TOPSIS; IVF-TOPSIS
Title of the Paper: A
Novel Robust Watermarking Technique Using IntDCT Based AC Prediction
Authors:
Kuo-Ming Hung
Abstract: Because of the blocking artifacts resulting from the 8x8 Discrete Cosine Transform (DCT), most watermarking techniques based on the 8x8 DCT suffer from degraded image quality. In 1990, Gonzales et al. described a technique that predicts a few low-frequency AC coefficients. The AC predictor uses the dequantized DC values of a 3x3 neighborhood of 8x8 blocks to predict the AC values in the center block. Wang proposed a data hiding scheme using this AC prediction technique in 2005, but it cannot predict AC coefficients accurately for images of all types. We propose a new watermarking system based on the 4x4 integer DCT (IntDCT) transform and adaptive AC estimation. The 4x4 IntDCT reduces the blocking artifacts caused by the 8x8 DCT and greatly improves imperceptibility and watermark capacity. Moreover, we utilize the AC prediction value as an error-checking code to enhance the robustness of the watermark.
Keywords:
Watermark; DCT; IntDCT; AC prediction; H.264
Title of the Paper:
Implementation Feasibility of Convex Recursive Deletion Regions Using
Multi-Layer Perceptrons
Authors:
Che-Chern Lin
Abstract: A constructive algorithm to implement convex recursive deletion
regions via two-layer perceptrons has been presented in a recent study. In the
algorithm, the absolute values of the weights become larger and larger when
the number of nested layers of a convex recursive deletion region increases.
In addition, the absolute values of the weights are determined according to
the complexity of the structure of the convex recursive deletion region. More
complicated convex recursive deletion regions result in larger values of
weights. In addition, a constructive procedure is needed to obtain the parameters (weights and thresholds) of the neural networks. In this paper, we propose a simple three-layer network structure to implement convex recursive deletion regions in which the weights of the second and third layers are all 1’s and the thresholds of the nodes in the second layer are pre-determined according to the structures of the convex recursive deletion regions. This paper also provides the activation function for the output node. In brief, all of the parameters (weights and activation functions) of the proposed structure are pre-determined and no constructive algorithm is needed for solving the convex
recursive deletion region problems. We prove the feasibility of the proposed
structure and give an illustrative example to demonstrate how the proposed
structure implements the convex recursive deletion regions. Finally, we
provide the conceptual diagram of the hardware implementation of the proposed
network structure.
Keywords:
Multi-layer perceptrons, nested decision region, convex recursive deletion
region, hardware implementation.
Title of the Paper:
Construction of Virtual Backbone on Growth-Bounded Graph with Variable
Transmission Range
Authors:
Yanjing Sun, Xiangping Gu, Jiansheng Qian
Abstract: Virtual backbone has been used extensively in various aspects for
wireless ad hoc or sensor networks recently. We propose an approximation
solution to construct a virtual backbone based on a more generalized and
realistic model of polynomially bounded growth. A localized distributed algorithm, MCDS_GBG, for computing a Minimum Connected Dominating Set as the backbone in a growth-bounded graph is presented. The approach consists of three stages: first, an MIS is constructed by a network decomposition scheme; second, a minimum dominating set is computed in a 2-separated collection with transmission range r; and finally, the Marking process and Rule k are used to reduce the virtual backbone with transmission range 3r. The computed Connected Dominating
Set guarantees a constant stretch factor on the length of a shortest path and
induces a subgraph of constant degree while the nodes only require direct
neighborhood information. The efficiency of our approach is confirmed through
both theoretical analysis and comparison study.
Keywords:
Virtual backbone; Growth-bounded graph; Connected dominating sets; Maximal
independent sets; Wireless ad hoc sensor network; Network decomposition
Issue 2, Volume 7, February 2008
Title of the Paper: An
Algorithm Based on Core Characteristic Extraction of Watermelon Seeds by
Automated Separating System
Authors:
Yong Sun, Yun Bai, Lihong Gang, Qiangguo Pu, Nikos Mastorakis
Abstract: An algorithm for extracting a series of characteristic values from flat objects, using watermelon seeds as the experimental subject, is successfully applied in an automated seed-separating software system, filling a gap in this area. It shows that it is practical for a machine equipped with a vision system to distinguish flat objects. New algorithms are designed and implemented for the lengths of the orthogonal major and minor axes, the degree of connectivity of surface patterns, and the area; the concept of the coverage degree of the black region is put forward, together with a way to distinguish objects of different colours by the distribution differences of their gray-level histograms. The system software has promising prospects and is meaningful for object-recognition applications.
Keywords:
long and short axis, the degree of connecting surface patterns, the coverage
degree of black region, core characteristic extraction, watermelon seeds
Title of the Paper: Fast
Pre-authentication with Minimized Overhead and High Security for WLAN Handoff
Authors:
Hung-Yu Chien, Tzu-Hang Hsu, Yuan-Liang Tang
Abstract: User mobility in WLANs is becoming more and more popular because of the wide deployment of WLANs and the numerous applications that run on them. Some of these applications, for example multimedia applications, require fast handoffs among access points to maintain the quality of service. In order to support multimedia applications for roaming users, IEEE 802.11i defines pre-authentication to reduce the re-authentication delay. The primary drawback of IEEE 802.11i pre-authentication is that the full 802.1X/EAP authentication incurs too much overhead. In this paper, we propose a new fast pre-authentication scheme that greatly improves efficiency and achieves a high security level.
Keywords:
Pre-authentication, fast handoff, wireless security, IEEE 802.11i.
Title of the Paper:
Fractional Fourier Transform Based Key Exchange for Optical Asymmetric Key
Cryptography
Authors:
Aloka Sinha
Abstract: Recently, several optical encryption techniques have been proposed for two-dimensional data. These techniques use random phase masks, jigsaw transforms, digital signatures, and linear transforms such as the Fourier transform, the fractional Fourier transform and the Fresnel transform. The
strength of these encryption techniques is dependent on the size of the key
but is strongly limited by the security linked with the exchange of the secret
key. We propose a new technique, based on the Diffie-Hellman protocol, in
which the key can be exchanged with high security. The Diffie-Hellman protocol
allows two users to exchange a secret key over an insecure channel without any
prior secrets. Fractional Fourier transforms have been used for the secure key
transfer. Results of computer simulation are presented to verify the proposed
idea and analyse the robustness of the proposed technique.
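The following minimal Python sketch shows the classic Diffie-Hellman exchange that the proposed technique builds on, with an illustrative small prime; the optical fractional-Fourier-transform stage itself is not reproduced.

```python
import secrets

# Classic Diffie-Hellman key exchange sketch. The small prime is for
# illustration only; real deployments use large, standardized primes.
p = 0xFFFFFFFB  # hypothetical small prime modulus (illustrative only)
g = 5           # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

A = pow(g, a, p)   # Alice sends A over the insecure channel
B = pow(g, b, p)   # Bob sends B over the insecure channel

shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # both sides derive the same secret key
print(hex(shared_alice))
```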
Keywords:
Fractional Fourier Transform, Optical encryption, Public key encryption,
Diffie-Hellman protocol, Fourier Transform, cryptography
Title of the Paper:
Increase The Efficiency of English-Chinese Sentence Alignment: Target
Range Restriction and Empirical
Selection of Stop Words
Authors:
Wing-Kwong Wong, Hsi-Hsun Yang, Wei-Lung Shen, Sheng-Kai Yin, Sheng-Cheng Hsu
Abstract: In this paper, we use a lexical method to do sentence alignment for
an English-Chinese corpus. Past research shows that alignment using a
dictionary involves a lot of word matching and dictionary look ups. To address
these two issues, we first restrict the range of candidate target sentences,
based on the location of the source sentence relative to the beginning of the
text. Moreover, careful empirical selection of stop words, based on word
frequencies in the source text, helps to reduce the number of dictionary look
ups. Experimental results show that the amount of word matching can be cut
down by 75% and that of dictionary look ups by as much as 43% without
sacrificing precision and recall. Another experiment was also done with twenty New York Times articles containing 598 sentences and 18395 words. The resulting precision is 95.6% and the recall is 93.8%. Among all predicted alignments, 86% are 1:1 (one source sentence to one target sentence), 8% are 1:2, and 6% are 2:1. Further analysis shows that most errors occur in
alignments of types 1:2 and 2:1. Future work should focus on problems with
these two alignment types.
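The following minimal Python sketch illustrates the target-range-restriction idea with a hypothetical window parameter; it is not the authors' implementation.

```python
# Only target sentences whose relative position in the target text is close
# to the source sentence's relative position are considered as alignment
# candidates. The window size is a hypothetical parameter.
def candidate_targets(src_index, n_src, n_tgt, window=5):
    """Return indices of target sentences worth comparing with source sentence src_index."""
    expected = round(src_index / max(n_src - 1, 1) * (n_tgt - 1))
    lo = max(0, expected - window)
    hi = min(n_tgt - 1, expected + window)
    return list(range(lo, hi + 1))

# Source sentence 50 of 100 is only compared against about 11 of 120 target
# sentences instead of all of them, cutting down word matching and lookups.
print(candidate_targets(src_index=50, n_src=100, n_tgt=120))
```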
Keywords:
Sentence alignment, lexical method, statistical method, English-Chinese
corpus, stop words, target range.
Title of the Paper:
TEMPLUM: A Process Adapted Numerical Simulation Code for The 3D Predictive
Assessment of Laser Surface Heat Treatments in Planar Geometry
Authors:
A. Garcia-Beltran, J. L. Ocana, C. L. Molpeceres
Abstract: A process adapted numerical simulation code for the 3D predictive
assessment of laser heat treatment of materials has been developed. Primarily
intended for the analysis of the laser transformation hardening of steels, the
code has been successfully applied for the predictive characterization of
other metallic and non-metallic materials that pose specific numerical difficulties owing to their extreme thermal and absorption properties. Initially based on a conventional FEM calculational structure, the
developed code (TEMPLUM) reveals itself as an extremely useful prediction tool
with specific process adapted features (not usually available in FEM heat
transfer codes) in the field of laser heat treatment applications.
Keywords:
Numerical analysis; Finite element; Modeling; Simulation; Heat conduction;
Laser surface treatments; Transformation hardening; Optical glass polishing.
Issue 3, Volume 7, March 2008
Title of the Paper: An
Improved Nested Partitions Algorithm Based on Simulated Annealing in Complex
Decision Problem Optimization
Authors:
Chang-Rui Yu, Yan Luo
Abstract: This paper introduces the main ideas of the nested partitions (NP)
method, analyses its efficiency theoretically and proposes the way to improve
the optimization efficiency of the algorithm. The paper then introduces the simulated annealing (SA) algorithm and incorporates the ideas of SA into two of the operators of the NP algorithm to form the combined NP/SA algorithm. Moreover, the paper presents the explicit optimization procedure of the combined NP/SA algorithm and explains its feasibility and superiority. The NP/SA algorithm combines the global optimization ability of the NP algorithm with the local search ability of the SA algorithm, thereby improving the optimization efficiency and the convergence rate. This paper also illustrates the NP/SA
algorithm through an optimization example.
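The following minimal Python sketch shows the generic simulated annealing ingredient (Boltzmann acceptance with geometric cooling) on a toy minimization problem; the combined NP/SA procedure itself is not reproduced.

```python
import math
import random

# Generic simulated annealing sketch minimizing a simple 1-D function.
def simulated_annealing(f, x0, t0=1.0, alpha=0.95, steps=2000):
    x, best = x0, x0
    t = t0
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)      # local move
        delta = f(candidate) - f(x)
        # Accept improvements always, worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
            if f(x) < f(best):
                best = x
        t *= alpha                                      # geometric cooling
    return best

print(simulated_annealing(lambda x: (x - 3.0) ** 2 + 1.0, x0=-10.0))
```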
Keywords:
Nested partitions algorithm, Simulated annealing, Complex decision problem.
Title of the Paper:
Faulty-Tolerant Algorithm for Mapping a Complete Binary Tree in an IEH
Authors:
Shih-Jung Wu, Jen-Chih Lin, Huan-Chao Keh
Abstract: Different parallel architectures may require different algorithms, and it is desirable that algorithms developed for one architecture can easily be transformed to, or implemented on, another. This paper proposes a novel algorithm for embedding complete binary trees in a faulty Incrementally Extensible Hypercube (IEH). Furthermore, to obtain a replacement node for a faulty node, 2-expansion is permitted, so that up to (n+1) faults can be tolerated with dilation 3, congestion 1 and load 1. The presented embedding methods are optimized mainly for balancing the processor loads, while minimizing dilation and congestion as far as possible. According to these results, parallel algorithms developed for the complete binary tree structure can be mapped onto an IEH. These reconfiguration methods enable extremely high-speed parallel computation.
Keywords:
Hypercube, Incrementally Extensible Hypercube, Complete binary tree,
Fault-Tolerance, Embedding
Title of the Paper:
Height, Size Performance of Complete and Nearly Complete Binary Search Trees
in Dictionary Applications
Authors:
Ahmed Tarek
Abstract: Trees are frequently used data structures for fast access to the
stored data. Data structures like arrays, vectors and linked lists are limited
by the trade-off between the ability to perform a fast search and the ability
to resize easily. Binary Search Trees are an alternative data structure that
is both dynamic in size and easily searchable. Nowadays, more and more people are becoming interested in using electronic organizers and telephone dictionaries instead of their hard-copy counterparts. In this paper, the performance of complete and nearly complete binary search trees is analyzed in terms of the number of tree nodes and the tree heights. The analytical results are applied to an electronic telephone dictionary for a medium-sized organization, and its performance is evaluated in the context of real-world
applications. The concept of multiple keys in data structure literature is
relatively new, and was first introduced by the author. To determine the
dictionary performance, another algorithm for determining the internal and the
external path lengths is also discussed. New results on performance analysis
are presented. Using tree-sort, individual records inside the dictionary may
be displayed in ascending order.
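As a small worked example of the height/size relationships analyzed in the paper, the following Python sketch applies the standard complete-binary-tree facts to some hypothetical dictionary sizes.

```python
import math

# Standard facts for complete binary trees (root at height 0): a complete
# binary search tree with n nodes has height floor(log2 n), and a perfect
# tree of height h holds exactly 2**(h+1) - 1 nodes. The dictionary sizes
# below are hypothetical.
def height_of_complete_tree(n_nodes: int) -> int:
    return int(math.floor(math.log2(n_nodes)))

def max_nodes_for_height(h: int) -> int:
    return 2 ** (h + 1) - 1

for n in (1_000, 10_000, 100_000):          # e.g. telephone-dictionary sizes
    h = height_of_complete_tree(n)
    print(f"{n} records -> height {h}, worst-case comparisons about {h + 1}")
```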
Keywords:
Complete Binary Search Tree, Nearly Complete Binary Search Tree, Electronic
Telephone Dictionary, Performance
Analysis, Performance Measurement, Logarithmic Time Complexity.
Title of the Paper:
Case-Oriented Alert Correlation
Authors:
Jidong Long, Daniel G. Schwartz
Abstract: Correlating alerts is of importance for identifying complex attacks
and discarding false alerts. Most popular alert correlation approaches employ
some well-defined knowledge to uncover the connections among alerts. However, acquiring,
representing and justifying such knowledge has turned out to be a nontrivial
task. In this paper, we propose a novel method to work around these
difficulties by using case-based reasoning (CBR). In our application, a case,
constructed from training data, serves as an example of correlated alerts. It
consists of a pattern of alerts caused by an attack and the identity of the
attack. The runtime alert stream is then compared with each case, to see if
any subset of the runtime alerts are similar to the pattern in the case. The
process is reduced to a matching problem. Two kinds of matching methods were
explored. The latter is much more efficient than the former. Our experiments with the DARPA Grand Challenge Problem attack simulator have shown that both produce almost the same results and that case-oriented alert correlation is effective in detecting intrusions.
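The following minimal Python sketch illustrates the case-matching idea with hypothetical cases and a hypothetical coverage threshold; it is not the authors' matching method.

```python
# A "case" is a set of alert signatures produced by a known attack, and a
# window of runtime alerts matches the case when it covers enough of that
# pattern. The cases, alerts and the 0.8 threshold are hypothetical.
CASES = {
    "worm_outbreak": {"port_scan", "buffer_overflow", "outbound_conn_spike"},
    "brute_force":   {"failed_login", "failed_login_burst", "account_lockout"},
}

def correlate(runtime_alerts, cases=CASES, threshold=0.8):
    """Return attack names whose alert pattern is mostly covered by the runtime alerts."""
    seen = set(runtime_alerts)
    matches = []
    for attack, pattern in cases.items():
        coverage = len(pattern & seen) / len(pattern)
        if coverage >= threshold:
            matches.append((attack, coverage))
    return matches

print(correlate(["failed_login", "failed_login_burst", "account_lockout", "port_scan"]))
```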
Keywords:
Alert Correlation, Case-Based Reasoning, Data Mining, Intrusion Detection
Title of the
Paper: CORAL - Online Monitoring in Distributed Applications: Issues and
Solutions
Authors:
Ivan Zoraja, Ivan Zulim, Maja Stula
Abstract: In this paper we describe and evaluate issues that arise in the development of online monitoring systems that connect software tools to a running distributed application. Our primary intention is to elaborate how to deal with complex middleware mechanisms that provide the middleware functionality in a way that is transparent to users and tools. Our current implementation, called Coral, manages DSM mechanisms that provide an abstraction of shared memory on loosely coupled hardware, and allows multiple tools to perform consistent yet efficient operations on the entities being monitored. Since our primary design choice for Coral was portability, we plan to port Coral to distributed environments based on SOA technology.
Keywords:
Online Monitoring, DSM, Tools, Process migration, Performance analysis,
Checkpointing
Title of the
Paper: Mining Strong Positive and Negative
Sequential Patterns
Authors:
Nancy P. Lin, Hung-Jen Chen, Wei-Hua Hao, Hao-En Chueh, Chung-I Chang
Abstract: In the data mining field, sequential pattern mining can be applied in diverse applications such as basket analysis, web access pattern analysis, and quality control in manufacturing engineering. Many methods have been proposed for mining sequential patterns. However, conventional methods only consider the occurrences of itemsets in customer sequences. The sequential patterns discovered by these methods are called positive sequential patterns, i.e., they only represent the occurrences of itemsets. In practice, the absence of a frequent itemset in a sequence may also convey significant information. We call a sequential pattern that also represents the absence of itemsets in a sequence a negative sequential pattern. The two major difficulties in mining sequential patterns, especially negative ones, are that a huge number of candidates may be generated and that most of them are meaningless. In this paper, we propose a method for mining strong positive and negative sequential patterns, called PNSPM. In our method, the absences of itemsets are also considered, and only sequences with a high degree of interestingness are selected as strong sequential patterns. An example is given to illustrate the process of PNSPM. The results show that PNSPM can prune a large number of redundant candidates and extract meaningful sequential patterns from a large number of frequent sequences.
Keywords:
Data mining, Itemset, Frequent sequence, Positive sequential pattern, Negative
sequential pattern, Strong sequential pattern
Title of the
Paper: A Deflected Grid-based Algorithm for
Clustering Analysis
Authors:
Nancy P. Lin, Chung-I Chang, Hao-En Chueh, Hung-Jen Chen, Wei-Hua Hao
Abstract: The grid-based clustering algorithm, which partitions the data space
into a finite number of cells to form a grid structure and then performs all
clustering operations on this obtained grid structure, is an efficient
clustering algorithm, but its effect is seriously influenced by the size of
the cells. To cluster efficiently and simultaneously, to reduce the influences
of the size of the cells, a new grid-based clustering algorithm, called DGD,
is proposed in this paper. The main idea of DGD algorithm is to deflect the
original grid structure in each dimension of the data space after the clusters
generated from this original structure have been obtained. The deflected grid
structure can be considered a dynamic adjustment of the size of the original
cells, and thus, the clusters generated from this deflected grid structure can
be used to revise the originally obtained clusters. The experimental results verify that the DGD algorithm is indeed less influenced by the cell size than other grid-based algorithms.
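For reference, the following minimal Python sketch shows plain grid-based clustering (the baseline that DGD deflects); the cell size and density threshold are hypothetical parameters, and the deflection step itself is not reproduced.

```python
from collections import defaultdict, deque

# Points are hashed into square cells, cells holding at least `min_pts`
# points are "significant", and adjacent significant cells are merged.
def grid_cluster(points, cell_size=1.0, min_pts=3):
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    significant = {c for c, pts in cells.items() if len(pts) >= min_pts}

    clusters, seen = [], set()
    for start in significant:
        if start in seen:
            continue
        # Flood-fill over 8-connected significant cells.
        queue, members = deque([start]), []
        seen.add(start)
        while queue:
            cx, cy = queue.popleft()
            members.extend(cells[(cx, cy)])
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    nb = (cx + dx, cy + dy)
                    if nb in significant and nb not in seen:
                        seen.add(nb)
                        queue.append(nb)
        clusters.append(members)
    return clusters

data = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4), (5.1, 5.0), (5.3, 5.2), (5.2, 5.4)]
print(grid_cluster(data, cell_size=1.0, min_pts=3))
```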
Keywords:
Data Mining, Clustering Algorithm, Grid-based Clustering, Significant Cell,
Grid Structure
Title of the
Paper: Fast Mining of Closed Sequential
Patterns
Authors:
Nancy P. Lin, Wei-Hua Hao, Hung-Jen Chen, Hao-En Chueh, Chung-I Chang
Abstract: This paper proposes a novel algorithm for mining closed frequent sequences, a scalable, condensed and lossless representation of the complete set of frequent sequences that can be mined from a sequence database. The algorithm, FMCSP, applies several optimization methods, such as equivalence classes, to reduce the search space and the run time. In particular, since one of the main issues in this type of algorithm is the redundant generation of closed sequences, we propose an effective and memory-saving method that, unlike previous works, does not require the complete set of closed sequences to reside in memory.
Keywords:
data mining, sequential patterns mining, closed sequential patterns
Title of the
Paper: A Data Centered Approach for Cache
Partitioning in Embedded Real-Time Database System
Authors:
Hu Wei, Chen Tianzhou, Shi Qingsong, Jiang Ning
Abstract: Embedded real-time databases have become a basic part of embedded systems in many application environments. Caches are used to reduce the gap between the processor and off-chip memory, but they introduce unpredictability into real-time systems. Although several cache partitioning approaches have been proposed to tackle this problem, no scheme has so far been designed for real-time database systems. In this paper, we present a data-centered cache partitioning approach that allows different tasks to share a locking partition in the cache. The hard real-time tasks have their own partitions and can thus achieve high predictability. At the same time, a shared non-locking partition is reserved for the soft real-time tasks. In this way we can target performance improvements based on the data that are frequently used by many tasks in the system. Our experimental results show that
the miss rate can be reduced by about 10%~18% compared with that of a
statically partitioned cache and by about 24%~40% compared with a dynamic
cache using LRU replacement policy.
Keywords:
Data sharing, Cache partitioning, Embedded database, Real-time
Title of the
Paper: The Curling Vector Field Transform
of Gray-Scale Images: A Magneto-Static Inspired Approach
Authors:
X. D. Zhuang, N. E. Mastorakis
Abstract: For image structure representation and feature extraction, the
curling vector field transform is proposed based on the magneto-static
analogy. The digital image is taken as the source of the vector field, and the
vector field transform of the image is presented imitating the form of the
physical magnetic field, which has a distinctive feature of rotating whorl
pattern. The simulation results indicate that the curling vector field can
represent the image’s structure feature, which can be applied in image
segmentation. The experimental results show that image segmentation can be
effectively implemented based on the image structure feature extracted by the
curling vector field transform.
Keywords:
vector field transform, curling vector field, image structure, image
segmentation
Issue 4, Volume 7, April 2008
Title of the
Paper: Electrocardiogram Compression and
Optimal ECG Filtering Algorithms
Authors:
Mihaela Lascu, Dan Lascu
Abstract: In this paper novel compression techniques are developed for
portable heart-monitoring equipment that could also form the basis for more
intelligent diagnostic systems thanks to the way the compression algorithms
depend on signal classification. There are two main categories of compression
which are employed for electrocardiogram signals: lossless and lossy. Design
of an optimal Wiener filter is implemented to remove noise from a signal,
considering that the signal is statistically stationary and the noise is a
stationary random process that is statistically independent of the signal. Two
programs for compression and Wiener optimal filtering are realized in MATLAB.
The main idea of optimal filtering is to give larger weight coefficients to the parts of the signal spectrum where the noise has less power and the true signal components have more power. Savitzky-Golay filtering is also applied to a noisy electrocardiogram, and a comparison is made among four methods: Wiener, Butterworth, Savitzky-Golay and synchronized averaging.
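The following minimal Python sketch illustrates the frequency-domain Wiener weighting idea on a synthetic signal with a known noise level; it is not the paper's MATLAB implementation, and the signal and noise figures are hypothetical.

```python
import numpy as np

# Spectral components where the (assumed known) noise power dominates are
# attenuated; components where the signal dominates are kept.
rng = np.random.default_rng(0)
fs = 360                                   # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t) + 0.4 * np.sin(2 * np.pi * 8.0 * t)
noise_sigma = 0.3
noisy = clean + rng.normal(0, noise_sigma, clean.shape)

X = np.fft.rfft(noisy)
noise_power = noise_sigma ** 2 * len(noisy)        # flat noise spectrum estimate
signal_power = np.maximum(np.abs(X) ** 2 - noise_power, 0.0)
weights = signal_power / (signal_power + noise_power)   # Wiener gain per bin
filtered = np.fft.irfft(weights * X, n=len(noisy))

print("RMS error before:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMS error after: ", np.sqrt(np.mean((filtered - clean) ** 2)))
```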
Keywords:
Electrocardiogram, Compression, Filtering, Matlab, Noise, Diagnostic.
Title of the
Paper: An Iterative Method for
Finite-Element Solutions of the Nonlinear Poisson-Boltzmann Equation
Authors:
Ren-Chuen Chen
Abstract: A finite-element (FE) approach combined with an efficient iterative method has been used to provide a numerical solution of the nonlinear Poisson-Boltzmann equation. The iterative method solves the nonlinear equations arising from the FE discretization procedure by a node-by-node calculation. Moreover, extensions based on the Picard, Gauss-Seidel, and successive overrelaxation (SOR) methods are also presented and analyzed for the FE solution. The performance of the proposed methods is illustrated by applying them to the problem of two identical colloidal particles in a symmetric electrolyte. The numerical results are found to be in good agreement with previously published results. A comprehensive survey of the accuracy and efficiency of these methods is also given.
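The following minimal Python sketch shows the node-by-node SOR iteration on a small linear system; the paper's nonlinear Poisson-Boltzmann discretization is not reproduced, and the matrix and relaxation factor are hypothetical.

```python
import numpy as np

# Successive overrelaxation (SOR) applied node-by-node to A x = b.
def sor(A, b, omega=1.5, tol=1e-10, max_iter=10_000):
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):                            # sweep node by node
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            gs_update = (b[i] - sigma) / A[i, i]      # Gauss-Seidel value
            x[i] = (1 - omega) * x_old[i] + omega * gs_update
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x = sor(A, b)
print(x, "residual:", np.linalg.norm(A @ x - b))
```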
Keywords:
finite-element method, Poisson-Boltzmann equation, colloidal particles
interaction
Title of the
Paper: NURBS Curve Shape Modification and
Fairness Evaluation
Authors:
Tetsuzo Kuragano, Akira Yamaguchi
Abstract: For the purpose of evaluation, a NURBS curve is used, because it is
commonly used in the areas of CAD/CAM and Computer Graphics. A curve with a
monotone radius of curvature distribution is considered as a fair curve in the
area of Computer Aided Aesthetic Design (CAAD). But no official standards have
been established. Therefore, a criterion for a fair curve is proposed. A
quintic NURBS curve, the first derivative of a quintic NURBS curve, curvature
vector, curvature, and radius of curvature are expressed. The concept of
radius of curvature specification to modify the shape of a NURBS curve is
illustrated. The difference between the NURBS curve radius of curvature and
the specified radius of curvature is minimized by introducing the
least-squares method to modify the shape of the NURBS curve. As curve fairness
evaluation, radius of curvature distribution is used as an alternative
characteristic of a curve. Algebraic functions of first through sixth degree (linear, quadratic, cubic, quartic, quintic, and sextic) are applied to the radius of curvature distribution of the designed curve to specify the radius of curvature. Then, the shape of the curve is modified according to the specified radius of curvature distribution. In this manner, six NURBS curves whose radii of curvature follow these algebraic functions are generated and predefined. Using correlation matching, the similarity is evaluated by comparing the radius of curvature distribution of the designed curve with those of the six predefined NURBS curves. The predefined curve with the highest similarity to the designed curve is selected, and its evaluated similarity is taken as the fairness of the designed curve.
Keywords:
curve shape modification, fair curve, radius of curvature specification,
correlation matching, fairness evaluation
Title of the
Paper: Automated Color Image Edge Detection
Using Improved PCNN Model
Authors:
Liang Zhou, Yu Sun, Jianguo Zheng
Abstract: Recent research indicates that pulse coupled neural networks can be used effectively for image processing tasks such as image segmentation and edge detection. However, up to now they have mainly been used for processing gray-scale or binary images, and the parameters of the network are usually adjusted and confirmed manually for different images, which impedes PCNN's application in image processing. To solve these problems, based on the pulse coupled neural network model and the HIS model, this paper first brings forward an improved PCNN model for color image segmentation whose parameters are determined automatically from the image's spatial and gray characteristics, and then uses this model to obtain the edge information. The experimental results show the good effect of the new PCNN model.
Keywords:
pulse coupled neural network (PCNN), image processing, HIS, color image
segmentation, parameter determination, image edge detection, spatial
characteristics, gray characteristics
Title of the
Paper: Development of Specific Disease Data
Warehouse for Developing Content from General Guide for Hypertension
Screening, Referral and Follow Up
Authors:
Teh Ying Wah, Ng Hooi Peng, Ching Sue Hok
Abstract: This paper proposes a method for developing a specific disease data warehouse, in this case a hypertension data warehouse. The significant steps in developing the data warehouse are described, especially data extraction, transformation and loading. The purpose of developing this data warehouse is to help specialists, health care teams and pharmacists figure out the best and most suitable strategies to implement during the screening, referral and follow-up process. As the data may come from various sources, mostly different websites, the amount of time spent on these tasks is often underestimated. Issues concerning how we crawl the data from the various sources and store it in a database are discussed further.
Keywords:
Data warehouse, Hypertension, Screening, Referral, Follow up
Title of the
Paper: Dynamic Threshold Determination for
Stable Behavior Detection
Authors:
Hiroyuki Yamahara, Fumiko Harada, Hideyuki Takada, Hiromitsu Shimakawa
Abstract: To provide services according to user behavior, parameters should be
adapted appropriately for the precise recognition of user behavior. In particular, the threshold value used to create the behavioral patterns matched during behavior recognition impacts recognition accuracy. Because the threshold value is common to all users in the conventional model, a threshold setting unsuitable for some users may cause low recognition rates. In this paper, we propose a behavior detection method which detects high-level user behaviors, such as “leaving home”. The proposed method achieves stable behavior recognition regardless of the user, by introducing a model which dynamically determines the threshold value for each individual user.
Keywords:
Threshold, Context, Behavior, Ambient, Proactive
Title of the
Paper: Automated Three Stage Red Lesions
Detection in Digital Color Fundus Images
Authors:
C. Marino, E. Ares, M. G. Penedo, M. Ortega, N. Barreira, F. Gomez-Ulla
Abstract: The screening process is a very valuable method for prevention of
many pathologies including Diabetic retinopathy. Typically, a large amount of
images have to be analyzed as diabetic patients have both their eyes examined
at least once a year. To effectively manage all this information and the
workload it produces, automatic techniques for analyzing the images are
required. These techniques must be robust, sensitive and specific to be
implemented in real-life screening applications. In this work an algorithm for
the detection of red lesions in digital color fundus photographs is proposed.
The method operates in three stages: in the first stage, candidate points for red lesions are obtained using a set of correlation filters working at different resolutions, thereby allowing the detection of a wider set of points. Then, in the second stage, a region growing segmentation process rejects the points from the prior stage whose size does not fit the red lesion pattern. Finally, in the third stage, three tests are applied to the output of the second stage: a shape test to remove non-circular areas, an intensity test to remove those areas belonging to the fundus of the retina, and a test to remove the points which fall inside the vessels (only lesions outside the vessels are considered). Evaluation was performed on a test set composed of images representative of those normally found in a screening set. Moreover, a comparison with manually obtained results from clinical experts is performed, to establish the accuracy and reliability of the method.
Keywords:
Fundus, microaneurysm, red lesions, retina, screening, correlation.
Title of the
Paper:
Network Reliability Importance Measures: Combinatorics and Monte Carlo based Computations
Authors:
Ilya Gertsbakh, Yoseph Shpungin
Abstract: In this paper we focus on computational aspects of network
reliability importance measure evaluation. It is a well-known fact that most network reliability problems are NP-hard, and therefore there is a significant gap between theoretical analysis and the ability to compute different reliability parameters for large or even moderately sized networks. In this paper we present two very efficient combinatorial Monte Carlo models for evaluating network reliability importance measures.
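The following minimal Python sketch shows crude Monte Carlo estimation of two-terminal reliability on a hypothetical bridge network; the authors' combinatorial estimators are not reproduced.

```python
import random
from collections import defaultdict

# Sample edge states, check s-t connectivity, and estimate the probability
# that the terminals stay connected. The network and edge reliability are
# hypothetical.
def connected(up_edges, s, t):
    adj = defaultdict(list)
    for u, v in up_edges:
        adj[u].append(v)
        adj[v].append(u)
    stack, seen = [s], {s}
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return False

def estimate_reliability(edges, p_up, s, t, samples=20_000):
    hits = 0
    for _ in range(samples):
        up = [e for e in edges if random.random() < p_up]   # sample edge states
        hits += connected(up, s, t)
    return hits / samples

bridge = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]   # classic bridge network
print(estimate_reliability(bridge, p_up=0.9, s=0, t=3))
```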
Keywords:
Network, Reliability, Importance Measure, Monte Carlo, Combinatorial Approach
Title of the
Paper: A Cognitive Tool to Support
Mathematical Communication in Fraction Word Problem Solving
Authors:
Azlina Ahmad, Siti Salwah Salim, Roziati Zainuddin
Abstract: Word problem solving is one of the most challenging tasks in
mathematics for most students. It requires the solver to translate the problem
into the language of mathematics, where we use symbols for mathematical
operations and for numbers, whether known or unknown. A study conducted on Malaysian school students found that the majority of them did not write their solutions to word problems using correct mathematical language. Intrapersonal and interpersonal communication are important in mathematics learning, especially in word problem solving. The main aim of this paper is therefore to present a model that promotes the use of mathematical language. The model is used as a basis for designing a computer-based learning environment for word problem solving. A cognitive tool named MINDA, which incorporates several necessary steps and activities, was developed to facilitate learning. The experimental analysis conducted on MINDA found that students' mathematical communication and word problem solving achievement improved.
Keywords:
word problem solving, cognitive tool, mathematical communication,
intrapersonal communication, interpersonal communication.
Title of the
Paper: Extending the Equivalent
Transformation Framework to Model Dynamic Interactive Systems
Authors:
Courtney Powell, Kiyoshi Akama
Abstract: Conceptualizing, visualizing, analyzing, reasoning about and
implementing Dynamic Interactive Systems (DISs) are difficult and error-prone
activities. To conceptualize and reason about the sorts of properties expected
of any DIS, a formal framework that most naturally facilitates
conceptualization and modelling of DISs is essential. In this paper we propose
and explain why extending the Equivalent Transformation Framework to
conceptually model DISs satisfies this ideal. The benefits to be derived from
using this framework include a simplified and intuitive conceptualization
process, mathematically sound models, guaranteed system correctness, high
level abstraction, clarity, granular modularity, and an integrated framework
for reasoning about, manipulating, and optimizing the various aspects of DISs.
Keywords:
Conceptual Modelling, Dynamic Interactive Systems, Equivalent Transformation,
Correctness, Formal Methods.
Title of the
Paper: An Application of Data Mining
Technique in Developing Sizing System for Army Soldiers in Taiwan
Authors:
Hai-Fen Lin, Chih-Hung Hsu, Mao-Jiun J. Wang, Yu-Cheng Lin
Abstract: In the field of garment manufacturing, the planning and control of
production and inventory are rather complicated procedures. This is the reason
that establishing standard sizing systems is necessary and important for garment manufacturers in Taiwan. As standard sizing systems need anthropometric data for reference, an anthropometric database for Taiwan army servicemen was first constructed for the purpose of simplifying the entire process. Anthropometric data collected from scratch are used to establish the sizing systems. The data mining approach, which has been used extensively in many fields, is applied in this project. Few studies have addressed the establishment of sizing systems. This study aims to establish systems for determining the sizes of garments for army personnel using data mining techniques. The newly developed sizing systems can be adopted to predict the requirements for different sizes of uniforms more accurately, and then to generate a practical production planning procedure. This study found that, by applying data mining techniques, unnecessary inventory costs resulting from sizing mismatches, which cause large differences between the numbers of soldiers and of produced garments, can be significantly reduced.
Keywords:
Anthropometric data, Data mining, Decision tree, Sizing systems, Production
planning, Garment manufacturing
Title of the
Paper: QoS Integration of the Internet and
Wireless Sensor Networks
Authors:
Weilian Su, Bassam Almaharmeh
Abstract: Recent developments in sensor networking for both military and
civilian applications emphasized the need for a reliable integration of sensor
networks with the Internet. For sensor networks deployed in various military
applications, it is important that collected information be delivered as fast
as possible with minimum delays. In this paper, an integration module is
proposed. The objective of the module is to provide preferential services for
high-priority traffic. The integration module is implemented and tested using hardware equipment such as Cisco routers and 10/100 Mbps switches. According to the testbed measurements, the proposed integration module is able to adapt to different traffic needs, thus ensuring QoS for different sensor network applications.
Keywords:
Wireless Sensor Networks, QoS, Integration Module.
Title of the
Paper: Channel Propagation Measurement and
Simulation of MICAz mote
Authors:
Weilian Su, Mohamad Alzaghal
Abstract: Wireless Sensor Networks (WSNs) are an important field of study, as more and more of their applications enhance daily life. The technology trend is toward small, cheap, and power-efficient sensor nodes, which will make such systems reliable and efficient. The Crossbow Technologies MICAz mote
is an example used in this paper. Measurements of its propagation
characteristics in a realistic environment will help the deployment and
installation of these motes to form a WSN. The CST Microwave Studio is used to
build a simulation of the MICAz. The results and comparisons between empirical
and simulated data are intended to assist in the design, future studies and
deployment of WSNs in the real world.
Keywords:
Wireless Sensor Networks, Propagation Characteristics, MICAz Motes
Title of the
Paper: Preventing Conflict Situations
During Authorization
Authors:
Sylvia Encheva, Sharil Tumin
Abstract: Computer-based access control systems working with financial and
privacy issues are concerned with access control policies. Structuring
authorizations turns out to be of key importance in the case of collaborating organizations.
Keywords:
Computer-based access control systems
Title of the
Paper: E-Research Centre and E-Creative
Design new Trends for E-Activities Platform
Authors:
Sorin Borza, Dan Paul Brindasu, Livia Beju, Marinela Inta
Abstract: In this paper the authors introduce the concept of E-creative design, which refers to the use of methods and techniques (modified for Internet application) for stimulating individual and group creativity in a design session using an e-research center for researchers in the field of mechanical engineering, as well as its development and integration into the existing national and international infrastructure. The goal of this center is to include as many researchers as possible, initially from our faculty, then from our university, our geographic area, our country, and finally from all over the world. To this purpose, the center sets out to promote and support new e-research projects and to encourage the building of multidisciplinary groups that cooperate among themselves and share resources and infrastructure. One of these projects is E-Creative Design. The E-Creative Design methodology is presented first, step by step in pseudocode. In order to use creative methods on the Internet, the morphological analysis and “San Francisco” creativity methods are presented from this new perspective. Finally, the main functions of a research-design platform are elaborated. An example of an E-creative session on the Internet in the cutting tool area completes the presentation.
Keywords:
e-research, e-science, cyberstructure, middleware, virtual centre, e-design,
grid
Title of the
Paper: A Comprehensive Taxonomy of DDoS
Attacks and Defense Mechanism Applying in a Smart Classification
Authors:
Abbass Asosheh, Naghmeh Ramezani
Abstract: A distributed denial of service (DDoS) attack uses multiple machines operating in concert to attack a network or site, and it is one of the most important security problems for IT managers. These attacks are very simple for intruders to organize and hence highly disruptive. The detection of and defense against this attack are of specific importance to network specialists. In this paper a new and smart taxonomy of DDoS attacks and defense mechanisms is introduced. The attack taxonomy is built using both known and potential attack mechanisms; it covers all types of attacks and provides a comprehensive view of DDoS attacks. We introduce a useful tool that can be employed for the sophisticated selection of a defense method against DDoS attacks. Furthermore, a smart classification method for DDoS attacks is proposed to help select an appropriate defense mechanism. This method uses some features of DDoS attacks, classifies them into several clusters with the K-means algorithm, and labels each cluster with a defense mechanism. If an IDS detects a DDoS attack, the proposed system extracts the attack features and classifies it by KNN (K-Nearest-Neighbor) to determine the cluster to which it belongs. The defense mechanism taxonomy uses the currently known approaches. The comprehensive defense classification will also help to find the appropriate strategy to overcome a DDoS attack.
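The following minimal Python sketch illustrates the cluster-then-classify idea with scikit-learn on hypothetical feature vectors and an assumed cluster-to-defense mapping; it is not the authors' system.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

# Historical attack feature vectors are grouped with K-means, each cluster is
# labelled with a defense mechanism, and a newly detected attack is assigned
# to a cluster with KNN. All data and labels below are hypothetical.
rng = np.random.default_rng(42)
historical_attacks = np.vstack([
    rng.normal([0.9, 0.1, 0.2], 0.05, size=(20, 3)),   # e.g. flooding-like
    rng.normal([0.2, 0.8, 0.7], 0.05, size=(20, 3)),   # e.g. protocol-abuse-like
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(historical_attacks)
defense_for_cluster = {0: "rate limiting", 1: "protocol filtering"}  # assumed mapping

knn = KNeighborsClassifier(n_neighbors=5).fit(historical_attacks, kmeans.labels_)
new_attack = np.array([[0.85, 0.15, 0.25]])
cluster = int(knn.predict(new_attack)[0])
print("suggested defense:", defense_for_cluster[cluster])
```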
Keywords:
DDoS attack, Defense mechanism, Taxonomy, Detection, Smart Classification
Title of the
Paper: Parallelization of Prime Number
Generation using Message Passing Interface
Authors:
Izzatdin Aziz, Nazleeni Haron, Low Tan Jung, Wan Rahaya Wan Dagang
Abstract: In this research, we propose a parallel processing algorithm for prime number generation that runs on a cluster architecture. The proposed approach was written using the Message Passing Interface (MPI) and is meant to decrease the computational cost and accelerate the prime number generation process. Several experimental results obtained using the High Performance Linpack (HPL) benchmark are presented to demonstrate the viability of our work. The results suggest that the performance of our work is on par with other parallel algorithms.
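The following minimal mpi4py sketch shows one way to split prime generation across MPI ranks; the authors' exact algorithm and implementation are not reproduced, and the upper bound is a hypothetical parameter.

```python
from mpi4py import MPI

# Each rank tests a strided slice of the range and the results are gathered
# on rank 0. Run with e.g.:  mpiexec -n 4 python primes_mpi.py
N = 100_000  # hypothetical upper bound

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local_primes = [n for n in range(2 + rank, N, size) if is_prime(n)]
all_primes = comm.gather(local_primes, root=0)

if rank == 0:
    primes = sorted(p for chunk in all_primes for p in chunk)
    print(f"found {len(primes)} primes below {N}")
```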
Keywords:
Prime number generation, parallel processing, cluster architecture, MPI,
primality test.
Title of the
Paper: Experiment Replication and
Meta-Analysis in Evaluation of Intelligent Tutoring System’s Effectiveness
Authors:
Ani Grubisic, Slavomir Stankov, Branko Zitko
Abstract: This paper presents the methodology for conducting controlled
experiment replication, as well as, the results of a controlled experiment and
an internal replication that investigated the effectiveness of an intelligent
tutoring system. Since, there doesn’t seem to be a common ground on guidelines
for the replication of experiments in intelligent tutoring system’s
educational influence evaluation, this scientific method has just started to
be applied to this propulsive research field. We believe that every
effectiveness evaluation should be replicated at least in order to verify the
original results and to indicate an evaluated e-learning system’s advantages
or disadvantages. On the grounds of experiment replication, a meta-analysis
can be conducted in order to calculate overall intelligent tutoring system
effectiveness.
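As a small worked example of the arithmetic behind such a meta-analysis, the following Python sketch computes Cohen's d for two hypothetical replications and a sample-size-weighted overall effect; the group statistics are invented for illustration.

```python
import math

# Cohen's d: standardized mean difference between tutored and control groups.
def cohens_d(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    pooled_sd = math.sqrt(((n_exp - 1) * sd_exp**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_exp + n_ctrl - 2))
    return (mean_exp - mean_ctrl) / pooled_sd

# (mean, sd, n) for experimental and control groups in two replications.
experiments = [
    ((78.0, 10.0, 30), (71.0, 11.0, 30)),   # original experiment
    ((75.0, 12.0, 25), (70.0, 12.5, 25)),   # internal replication
]

effects, weights = [], []
for (me, se, ne), (mc, sc, nc) in experiments:
    effects.append(cohens_d(me, se, ne, mc, sc, nc))
    weights.append(ne + nc)

overall = sum(d * w for d, w in zip(effects, weights)) / sum(weights)
print("per-experiment d:", [round(d, 2) for d in effects], "overall:", round(overall, 2))
```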
Keywords:
e-learning, intelligent tutoring systems, evaluation, effect size,
effectiveness, experiment, replication, meta-analysis
Title of the
Paper: Context-Aware Learning Path Planner
Authors:
Maiga Chang, Alex Chang, Jia-Sheng Heh, Tzu-Chien Liu
Abstract: This paper develops a context-aware learning path planner. The planner constructs a suitable learning path for each individual student according to his/her misconceptions about the learning objects and the distances in the real world. Besides the remedial learning path, the planner also provides two kinds of guidance messages to students: moving guidance messages and learning guidance messages. The moving guidance messages are used to lead students traveling from one learning spot to another. The learning guidance messages are used to guide students in observing the specific part of the learning objects in order to clear up their misconceptions. At the end of this paper, an example shows how the planner works for learning in a museum.
Keywords: Mobile Learning, Learning Path, Knowledge Structure,
Misconception, Situated Learning
Title of the
Paper: Assembly Time Minimization for an
Electronic Component Placement Machine
Authors:
Ali Fuat Alkaya, Ekrem Duman, Akif Eyler
Abstract: In this study, a new algorithm for a particular component placement
machine is proposed. Also, a pairwise exchange procedure is designed and
applied after the proposed algorithm. For the analyzed machine, previously proposed algorithms mounted the components from lightest to heaviest, whereas we propose to mount the components from heaviest to lightest. This method brings new opportunities in terms of performance gain on typical
Printed Circuit Boards produced in the industry. The new algorithm is compared
with former approaches on synthetically generated instances. It outperforms
the former approaches by 1.54 percent on printed circuit boards with 100
components to be placed. It gives better results in 96 out of 100 instances.
Applying the designed pairwise exchange procedure after the proposed algorithm
further improves the total assembly time. Moreover, by taking advantage of
inherent design of the analyzed PCBs, other promising improvement procedures
are also suggested for minimizing total assembly time. By applying all the
suggested techniques to the problem instances, performance improvement rate
reaches up to 4.5 percent when compared with previous studies. Since the
architecture and working principles of widely used new technology placement
machines are very similar to the one analyzed here, the improvement techniques
developed here can easily be generalized to them.
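The following minimal Python sketch illustrates a pairwise-exchange improvement pass on a hypothetical placement sequence, with plain Euclidean distance standing in for the machine's assembly-time model, which is not reproduced here.

```python
import math

# Swap two components in the placement sequence and keep the swap whenever it
# shortens the total head travel. Coordinates and initial order are hypothetical.
def tour_length(order, coords):
    return sum(math.dist(coords[a], coords[b]) for a, b in zip(order, order[1:]))

def pairwise_exchange(order, coords):
    improved = True
    order = list(order)
    while improved:
        improved = False
        for i in range(len(order) - 1):
            for j in range(i + 1, len(order)):
                candidate = order[:]
                candidate[i], candidate[j] = candidate[j], candidate[i]
                if tour_length(candidate, coords) < tour_length(order, coords):
                    order, improved = candidate, True
    return order

coords = {0: (0, 0), 1: (5, 1), 2: (1, 4), 3: (6, 5), 4: (2, 1)}
initial = [0, 3, 1, 4, 2]                      # e.g. heaviest-to-lightest order
best = pairwise_exchange(initial, coords)
print(best, round(tour_length(best, coords), 2))
```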
Keywords:
Printed Circuit Board Assembly, Traveling Salesman Problem, Placement Machine
Issue 5, Volume 7, May 2008
Title of the
Paper: Using Policy-based MPLS Management
Architecture to Improve QoS on IP Network
Authors:
Ruey-Shun Chen, Yung-Shun Tsai, K. C. Yeh, H. Y. Chen
Abstract: Multi-Protocol Label Switching (MPLS) is in the process of
standardization by the Internet Engineering Task Force (IETF). It is regarded
as a technology for traffic engineering and QoS in IP networks. We propose an IETF policy-based network management framework and policies with MPLS-specific classes. It uses a three-level policy architecture, which manages the device, network, and service levels using policies to support IntServ- and DiffServ-based end-to-end QoS in the Internet. A prototype policy-based management system for MPLS traffic engineering operates on MPLS network elements. Several experiments illustrate the efficiency and feasibility of this architecture. The results show that it can reduce the time needed to set up an MPLS traffic engineering tunnel over multiple hops and to delete such a tunnel. The proposed integrated policy-based management architecture will allow network service providers to offer both quantitative and qualitative services while optimizing the use of the underlying network resources.
Keywords:
Multiple Protocol Label Switching, Traffic Engineering, Quality of Service,
Policy-based Management
Title of the
Paper: Fuzzy Stroke Analysis of Devnagari
Handwritten Characters
Authors:
Prachi Mukherji, Priti P. Rege
Abstract: Devnagari script is a major script of India widely used for various
languages. In this work, we propose a fuzzy stroke-based technique for
analyzing handwritten Devnagari characters. After preprocessing, the character
is segmented in strokes using our thinning and segmentation algorithm. We
propose Average Compressed Direction Codes (ACDC) for shape description of
segmented strokes. The strokes are classified as left curve, right curve,
horizontal stroke, vertical stroke and slanted lines etc. We assign fuzzy
weight to the strokes according to their circularity to find similarity
between over segmented strokes and model strokes. The character is divided
into nine zones and the occurrences of strokes in each zone and combinations
of zones are found to contribute to Zonal Stroke Frequency (ZSF) and Regional
Stroke Frequency (RSF) respectively. The classification space is partitioned
on the basis of number of strokes, Zonal Stroke Frequency and Regional Stroke
Frequency. The knowledge of script grammar is applied to classify characters
using features like ACDC based stroke shape, relative strength, circularity
and relative area. A Euclidean distance classifier is applied for unordered stroke matching. The system tolerates a slant of about 10° to the left and right and a skew of 5° up and down. The system proves to be fast and efficient with regard to space and time, provides high discrimination between similar characters, and achieves a recognition accuracy of 92.8%.
Keywords:
Devnagari Script, Segmentation, Strokes, Average Compressed Direction Code,
Zonal and Regional Stroke Frequency, Euclidean Classifier.
Title of the
Paper: Mamdani’s Fuzzy Inference
eMathTeacher: a Tutorial for Active Learning
Authors:
M. Gloria Sanchez-Torrubia, Carmen Torres-Blanc, Sanjay Krishnankutty
Abstract: An eMathTeacher [16] is an eLearning on–line self–assessment tool
that helps users to active learning math concepts and algorithms by
themselves, correcting their mistakes and providing them with clues to find
the right solution. This paper introduces an example of a new concept on
Computer Aided Instruction (CAI) resources, i.e. a tutorial designed under
eMathTeacher philosophy for active eLearning Mamdani’s Direct Method, and
presents a brief survey on available CAI resources discussing what their
influence over students’ behaviour is. It also describes the minimum and
complementary requirements an eLearning tool must fulfil to be considered an
eMathTeacher as well as the main contributions of this kind of tutorials to
the learning processes. Needless to say that, such features as interactivity,
visualization and simplicity turn these tools into great value pedagogical
instruments.
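The following minimal Python sketch runs Mamdani's direct method on a hypothetical two-rule example (triangular memberships, min implication, max aggregation, centroid defuzzification); the membership functions and rules are invented teaching values, not the tutorial's own exercise.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

speed = np.linspace(0.0, 10.0, 501)          # output universe: fan speed
temperature = 27.0                           # crisp input

# Rule 1: IF temperature is warm THEN speed is medium.
# Rule 2: IF temperature is hot  THEN speed is high.
fire_warm = tri(temperature, 15, 25, 35)     # degree to which 27 C is "warm"
fire_hot = tri(temperature, 25, 40, 55)      # degree to which 27 C is "hot"

clipped_medium = np.minimum(fire_warm, tri(speed, 2, 5, 8))   # min implication
clipped_high = np.minimum(fire_hot, tri(speed, 6, 10, 14))
aggregated = np.maximum(clipped_medium, clipped_high)         # max aggregation

# Discrete centroid defuzzification over the uniform output grid.
crisp_output = float((aggregated * speed).sum() / aggregated.sum())
print(f"fan speed = {crisp_output:.2f}")
```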
Keywords:
eMathTeacher, eLearning, Active learning, Interactive Java applets, Fuzzy
Inference Systems (FIS), Fuzzy Logic, Computer Assisted Instruction (CAI).
Title of the
Paper: An Hybrid Simulated Annealing
Threshold Accepting Algorithm for Satisfiability Problems using Dynamically
Cooling Schemes
Authors:
Felix Martinez-Rios, Juan Frausto-Solis
Abstract: For the Satisfiability (SAT) problem there is no known deterministic algorithm able to solve it in polynomial time. Simulated Annealing (SA) and similar algorithms such as Threshold Accepting (TA) are able to find very good solutions to SAT instances only if their control parameters are correctly tuned. Classical TA algorithms usually use the same Markov chain length for each temperature cycle, but this wastes a lot of time. In this paper a new hybrid algorithm is presented; it is in fact a TA algorithm which is hybridized with SA in a certain way. For this TA algorithm, the Markov chain length (L) is obtained dynamically for each temperature. It is known that TA and SA obtain very good results only when their parameters are correctly tuned. Experimental tuning methods spend a lot of time before a TA algorithm can be executed correctly; on the other hand, analytical tuning methods for TA had previously been fully developed only for the geometric cooling function. This paper also shows how TA can be tuned for three common cooling functions with an analytical model. The experimentation presented in the paper shows that the new TA algorithm is more efficient than the classical one.
Keywords:
Simulated Annealing, Threshold Accepting, Cooling function, Dynamic Markov
Chains, SAT problem
Title of the
Paper: Strategic Planning for the Computer
Science Security
Authors:
Jorge A. Ruiz-Vanoye, Ocotlan Diaz-Parra, Ismael Rafael Ponce-Medellin, Juan
Carlos Olivares-Rojas
Abstract: The need for companies and organizations to adapt to technological changes in computer science leads to several key questions: How can the security of my organization be measured? What type of computer science security does my company, financial organization, or government need? Does my financial organization have computer science security measures in the correct areas? What new computer science security tools exist? What security strategies should we follow? What is the stream of information that needs to be transmitted through the different departments of my organization, in terms of computer security? What kinds of user roles exist in terms of organizational security? Is there a way to classify information in terms of computer security? In this paper we present a methodology for strategic planning of computer science security for diverse organizations such as banks and government agencies, grounded in the concepts of strategic administration of enterprise policy, which aims to answer the questions mentioned above.
Keywords:
Methodologies of Security, Strategic Planning, Computer Science Security.
Title of the
Paper: Seamless Multicast Handover in an
NC-HMIPv6 Environment
DOWNLOAD FULL PDF
Authors:
Lambert Kadjo Tanon, Souleymane Oumtanaga, Kone Tiemoman
Abstract: Multicast facilitates group communications in IP networks and largely
improves the usage efficiency of the bandwidth. However, although this
transmission technology has reached sufficient maturity in fixed networks, it
raises many problems in an IP environment where the receiver is mobile. The
problem is partially attributable to the protocols managing host mobility.
Indeed, in the mobile Internet, host mobility affects unicast addresses which
the multicast routing protocols consider stable. The change of these unicast
addresses therefore leads to long handover latency and packet losses due to
interruptions provoked by mobility. The best adaptation of multicast to the
mobile Internet thus depends strongly on the type of mobility protocol in use.
The current proposals for applications of multicast services are made in
mobility environments whose mobility management is insufficiently optimized
according to the following essential performance criteria: handover latency,
scalability and packet loss rate. Our proposal, based on the NC-HMIPv6
protocol, offers better mobile multicast management by leveraging the
possibilities offered by this protocol. By widening the features of various
entities in the NC-HMIPv6 environment, an effective management of the multicast
handover is proposed.
Keywords: Mobile IPv6, HMIPv6, NC-HMIPv6, NC-HMIPv6-M, Multicast,
ASM, MLD, PIM-DM, PIM-SM, PIM-SSM.
Title of the
Paper: The Pi-ADL.NET project: An Inclusive
Approach to ADL Compiler Design
DOWNLOAD FULL PDF
Authors:
Zawar Qayyum, Flavio Oquendo
Abstract: This paper describes results and observations pertaining to the
development of a compiler utility for the Architecture Description Language
π-ADL on the .NET platform. Architecture Description Languages, or ADLs, are
special-purpose high-level languages specifically designed to define software
architectures. π-ADL, a recent addition to this class of languages, is formally
based on the π-calculus, a process-oriented formal method. The compiler for
π-ADL, named π-ADL.NET, is designed with the view of bringing the
architecture-driven software design approach to the .NET platform. The
process-oriented nature and robust set of parallelism constructs of π-ADL make
the π-ADL.NET project a novel application of compiler techniques in the context
of the .NET platform, with many valuable lessons learnt. This paper presents
the π-ADL.NET effort from a compiler design perspective, and describes the
inclusive approach driving the design that facilitates the representation of
strong behavioral semantics in architecture descriptions. The subjects of
parallel process modeling, communication and constructed data types are
covered. The paper also documents the motivation, vision and future
possibilities for this line of work. A detailed comparison with related work
is also presented.
Keywords:
π-ADL, Compiler design, Aspect oriented programming, CIL, Software
architecture, Architecture description language
Title of the
Paper: A Novel Construction of Connectivity
Graphs for Clustering and Visualization
DOWNLOAD FULL PDF
Authors:
Wesam Barbakh, Colin Fyfe
Abstract: We [5, 6] have recently investigated several families of clustering
algorithms. In this paper, we show how a novel similarity function can be
integrated into one of our algorithms as a method of performing clustering and
show that the resulting method is superior to existing methods in that it can
be shown to reliably find a globally optimal clustering rather than local
optima which other methods often find. We discuss some of the current
difficulties with using connectivity graphs for solving clustering problems,
and then we introduce a new algorithm to build the connectivity graphs. We
compare this new algorithm with some famous algorithms used to build
connectivity graphs. The new algorithm is shown to be superior to those in the
current literature. We also extend the method to perform topology preserving
mappings and show the results of such mappings on artificial and real data.
Keywords:
Clustering, Similarity function, Connectivity graph, Visualisation, K-means,
Topographic mapping.
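As a generic illustration of building a connectivity graph from pairwise similarities (not the paper's own similarity function or construction), the sketch below links each point to its k most similar neighbours under a Gaussian similarity and symmetrizes the result.

```python
import numpy as np

def knn_connectivity_graph(X, k=3):
    """Build a symmetric k-nearest-neighbour connectivity graph.

    X : (n, d) array of points.
    Returns an (n, n) 0/1 adjacency matrix joining each point to its
    k most similar points under a Gaussian similarity."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
    sim = np.exp(-d2)                                      # Gaussian similarity
    np.fill_diagonal(sim, -np.inf)                         # ignore self-similarity
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in np.argsort(sim[i])[-k:]:                  # k most similar
            A[i, j] = A[j, i] = 1                          # symmetrize
    return A

pts = np.random.rand(10, 2)
print(knn_connectivity_graph(pts, k=2))
```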
Title of the
Paper: Incorporating the Biometric Voice
Technology into the E-Government Systems to Enhance the User Verification
DOWNLOAD FULL PDF
Authors:
Khalid T. Al-Sarayrh, Rafa E. Al-Qutaish, Mohammed D. Al-Majali
Abstract: Many countries around the world have started their e-government
programs. E-government portals will be increasingly used by the citizens of
many countries to access a set of services. Currently, the use of e-government
portals raises many challenges; one of these challenges is security. The
security of e-government portals is a very important characteristic that should
be taken into account. In this paper, we have incorporated biometric voice
technology into e-government portals in order to increase security and enhance
user verification. In this way, security should be increased since the user
needs to use his voice along with his password. Therefore, no unauthorized
person can access the e-government portal even if he/she knows the required
password.
Keywords:
Security Systems, E-Government Portals, Biometric Voice, Speaker Verification,
Authentication.
Title of the
Paper: Smart Card based Solution for
Non-Repudiation in GSM WAP Applications
DOWNLOAD FULL PDF
Authors:
Cristian Toma, Marius Popa, Catalin Boja
Abstract: The paper presents security issues and architectures for mobile
applications and the GSM infrastructure. The article also introduces a solution
for avoiding denial of service from WAP applications using WIM features. The
first section describes the structure of the GSM network from the voice and
data point of view. The security of the GSM network is presented in the second
section. The third section presents a solution for realizing mobile subscriber
non-repudiation. The solution is based on the HTTP protocol over WAP.
Keywords:
mobile security, m-application security, SAWNR - Secure Application for
Wireless Non Repudiation.
Title of the
Paper: Adapting a Legacy Code for Ordinary
Differential Equations to Novel Software and Hardware Architectures
DOWNLOAD FULL PDF
Authors:
Dana Petcu, Andrei Eckstein, Claudiu Giurgiu
Abstract: Modern software engineering concepts, like software as a service,
allow the extension of legacy code lifetime and the reduction of software
maintenance costs. The transformation of a legacy code into a service is not a
straightforward task, especially when the initial code was designed with a
rich user interface. A special case is presented in this paper, that of a
software code for solving ordinary differential equations. Initially designed
to use parallel computing techniques in the solving process, the code is now
modified to take advantage of current multi-core architectures. The
transformation paths are general and can be followed by other similar legacy
codes.
Keywords:
Wrapper, Web service, Parallel methods, Multicore architectures, Ordinary
differential equations
Title of the
Paper: Integrated Information Systems in
Higher Education
DOWNLOAD FULL PDF
Authors:
Ana-Ramona Lupu, Razvan Bologa, Gheorghe Sabau, Mihaela Muntean
Abstract: The paper briefly presents the situation of Romanian universities
regarding information systems implementation and deployment. The information
presented is the result of a study of the current state of Romanian
universities in the process of data and information system integration,
performed at the end of 2007 in 35 accredited universities. This study was used
as a basis for identifying and analyzing the main factors that influence the
development of an integrated university environment and for identifying
concrete action directions for accomplishing that integration.
Keywords:
Romanian universities, Management information systems, Enterprise Resource
Planning systems, Integrated information solutions
Title of the
Paper: Basis Path Test Suite and Testing
Process for WS-BPEL
DOWNLOAD FULL PDF
Authors:
Theerapong Lertphumpanya, Twittie Senivongse
Abstract: Web services technology offers the WS-BPEL language for business
process execution. The building blocks of WS-BPEL are the Web service
components that collaborate to realize a certain function of the business
process. Applications can now be built more easily by composing existing Web
services into workflows; each workflow itself is also considered a composite
Web service. As with other programs, basis path testing can be conducted on
WS-BPEL processes in order to verify the execution of every node of the
workflow. This paper discusses the generation of the test suite for basis path
testing of WS-BPEL and an accompanying tool that can be used by service
testers. The test suite consists of test cases, stubs of the constituent
services in the workflow, and auxiliary state services that assist in the
test; these are deployed when running a test on a particular WS-BPEL process.
The paper also presents a testing process for service testers. A business
process of a marketplace is discussed as a case study.
Keywords:
Basis path testing, WS-BPEL, test cases, control flow graph, cyclomatic
complexity
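Basis path testing, as described above, derives the number of independent paths from the cyclomatic complexity of the process's control flow graph, V(G) = E - N + 2 for a single connected graph. The sketch below (a generic illustration, not the paper's tool) computes this number for a small hypothetical control flow graph.

```python
def cyclomatic_complexity(edges):
    """V(G) = E - N + 2 for a connected control flow graph with one
    entry and one exit; this equals the number of basis paths to test."""
    nodes = {n for e in edges for n in e}
    return len(edges) - len(nodes) + 2

# Hypothetical CFG of a tiny WS-BPEL process:
# start -> if-branch -> (A | B) -> join -> while-loop -> end
cfg = [
    ("start", "if"), ("if", "A"), ("if", "B"),
    ("A", "join"), ("B", "join"),
    ("join", "while"), ("while", "join"),  # loop back edge
    ("while", "end"),
]
print(cyclomatic_complexity(cfg))  # -> 3 basis paths
```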
Title of the
Paper: Advanced Computer Recognition of
Aesthetics in the Game of Chess
DOWNLOAD FULL PDF
Authors:
Azlan Iqbal, Mashkuri Yaacob
Abstract: This research intended to see if aesthetics within the game of chess
could be formalized for computer recognition since it is often appreciated and
sought after by human players and problem composers. For this purpose, Western
or International chess was chosen because there is a strong body of literature
on the subject, including its aesthetic aspect. Eight principles of aesthetics
and ten themes were identified. Flexible and dynamic formalizations were
derived for each one and cumulatively represent the aesthetic score for a move
combination. A computer program that incorporated the formalizations was
developed for testing purposes. Experiments were then performed comparing sets
of thousands of chess compositions (where aesthetics is generally more
prominent) and regular games (where it is not). The results suggest that
computers can recognize beauty in the game. Possible applications of this
research include more versatile chess database search engines, more accurate
automatic chess problem composers, enhanced automatic chess game commentators
and computational aid to judges of composition and brilliancy tournaments. In
addition, the methodology applied here can be used to gauge aesthetics in
similarly complex games such as go and generally to develop better game
heuristics.
Keywords:
aesthetics, chess, game, evaluation, intelligence, computation
Title of the
Paper: Virtual Reality Approach in Treating
Acrophobia: Simulating Height in Virtual Environment
DOWNLOAD FULL PDF
Authors:
Nazrita Ibrahim, Mustafa Agil Muhamad Balbed, Azmi Mohd Yusof, Faridah Hani
Mohammed Salleh, Jaspaljeet Singh, Mohamad Shahrul Shahidan
Abstract: Acrophobia is the scientific term used to describe the fear of
heights. To some people, this fear is manageable, but to others, the fear could
pose a danger to their lives if it starts to interfere with their day-to-day
activities. The conventional treatment for acrophobia is usually done through
exposure therapy, where individuals suffering from acrophobia are gradually
exposed (physically) to height. The disadvantage of conventional treatment is
that it could put the sufferers in a life-threatening situation. Therefore, the
goal of this study is to investigate whether it is possible to create the
presence of height using a simple 3D virtual environment, which could later be
used in exposure therapy for acrophobia. The system consists of a multimedia
workstation, a Head Mounted Display (HMD) and a virtual scene of a busy city
surrounded by tall buildings. The experiment consists of the users being
gradually lifted up and down on an open elevator hanging outside one of the
buildings. A set of questions was asked of each participant after the
experiment, and the results showed that even with a simple 3D virtual
environment, the sensation of height could be simulated.
Keywords:
Virtual Reality, Acrophobia, 3D Environment, Virtual Environment, Virtools, VR
system treatment
Title of the
Paper: Spatial Decision Support System for
Reclamation in Opencast Coal Mine Dump
DOWNLOAD FULL PDF
Authors:
Yingyi Chen, Daoliang Li
Abstract: This paper describes the development and applications of a decision
support system that uses spatial information techniques and field survey data.
The SDSS enables the various levels of the target groups to easily identify
the best available solutions to reclamation problems. To support reasonable
decisions, the overall purpose of the SDSS is to provide a tool for site
evaluation and selection of the most appropriate reclamation schemes. The
system consists of three models: (1) an evaluation model for reclamation
potentiality based on the physical, chemical, and biological growth-limiting
factors in the target area. (2) Fuzzy similarity models to determine the
native plant species and metal-tolerant plants. (3) A case-based and
rule-based model to select the most appropriate reclamation schemes based on
the similarity in the physical, chemical, and biological growth conditions.
The models are developed in the C# language and integrated with GIS. All these
models are integrated in the SDSS which is able to provide information
concerning the recommended reclamation technologies for each case. Finally,
the uses of the SDSS in two cases, Haizhou (China) and Mao Moh Mine (Thailand), are described.
Keywords:
Coal mine waste land, Reclamation, Spatial decision support system, Remote
sensing, Geographical information system
Title of the
Paper: On Application of SOA to Continuous
Auditing
DOWNLOAD FULL PDF
Authors:
Huanzhuo Ye, Shuai Chen, Fang Gao
Abstract: In today's fast-paced business world, there is growing interest in
the concept of continuous auditing. Although current technology makes
continuous auditing possible, it still faces some problems. The main problems
concern the accuracy of the data, the real-time nature and comprehensiveness of
the audit, and the flexibility of the audit. In order to solve these problems,
a SOA-based conceptual model for continuous auditing is proposed in this paper.
The two main models of SOA (the service registry model and the enterprise
service bus model) are both applied to this conceptual model. The model also
requires the user interface to be separated from the client's management
information system so that the auditing systems can monitor the transactions
between the client and third parties. The model shows how new technology using
ESB, XBRL, the shadow subsystem (a just-in-time database) and intelligent
agents can help effectively audit the transactions between the client and third
parties. Finally, a SOA-based conceptual model for continuous auditing is
presented to provide real-time, comprehensive, and flexible auditing.
Keywords:
Continuous auditing; SOA; ESB; Real-time audit; Model
Title of the
Paper:
Nonlinear System Identification
with a Feedforward Neural Network and an Optimal Bounded Ellipsoid Algorithm
DOWNLOAD FULL PDF
Authors:
Jose De Jesus Rubio Avila,
Andres Ferreyra Ramirez, Carlos Aviles-Cruz
Abstract: Compared to common learning algorithms such as backpropagation, the
optimal bounded ellipsoid (OBE) algorithm has some better properties, such as
faster convergence, since it has a structure similar to the Kalman filter
algorithm. The optimal bounded ellipsoid algorithm also has some better
properties than Kalman filter training; one is that the noise is not required
to be Gaussian. In this paper the optimal bounded ellipsoid algorithm is
applied to train the weights of a feedforward neural network for nonlinear
system identification. Both hidden layers and output layers can be updated. In
order to improve the robustness of the optimal bounded ellipsoid algorithm, a
dead zone is applied to it. From a dynamic systems point of view, such training
can be useful for all neural network applications requiring real-time updating
of the weights. Two examples are provided which illustrate the effectiveness of
the suggested algorithm based on simulations.
Keywords:
Neural Networks, Optimal Bounded Ellipsoid (OBE), Modeling, Identification.
Title of the
Paper: Comparison Studies on Classification
for Remote Sensing Image Based on Data Mining Method
DOWNLOAD FULL PDF
Authors:
Hang Xaio, Xiubin Zhang
Abstract: Data mining methods have been widely applied in the area of remote
sensing classification in recent years. Among these methods, neural networks,
rough sets and support vector machines (SVM) have received more and more
attention. Although all of them have great advantages in dealing with imprecise
and incomplete data, there are essential differences among them. Until now,
research on these three methods has been reported in many publications, but how
to combine these theories with remote sensing applications is an important
direction for later research. Moreover, each of them has its own advantages and
disadvantages. To reveal their different characteristics in remote sensing
classification, neural networks, rough sets and support vector machines are
each applied to remote sensing image classification. The comparison results
among these three methods will be helpful for studies on remote sensing image
classification, and the paper also provides a new viewpoint on remote sensing
image classification for future work.
Keywords:
remote sensing image classification, data mining, neural network, variable
precision rough sets model, support vector machine, comparison studies
Title of the
Paper: Monitoring Event-Based Suspended
Sediment Concentration by Artificial Neural Network Models
DOWNLOAD FULL PDF
Authors:
Yu-Min Wang, Seydou Traore, Tienfuan Kerh
Abstract: This paper is concerned with monitoring the hourly event-based river
suspended sediment concentration (SSC) due to storms at the Jiasian diversion
weir in southern Taiwan. The weir is built to supply on average 0.3 million
tons of water per day for civil and industrial use. Information on the
suspended sediment fluxes of rivers is crucial for monitoring water quality.
The issue of water quality is of particular importance to the Jiasian area,
where there are high population densities and intensive agricultural
activities. Therefore, this study explores the potential of using artificial
neural networks (ANNs) for modeling the event-based SSC for continuous
monitoring of river water quality. The data collected include the hourly water
discharge, turbidity and SSC during storm events. A feed-forward
backpropagation network (BP), a generalized regression neural network (GRNN),
and classical regression were employed to test their performance. From the
statistical evaluation, it was found that the performance of BP was slightly
better than that of the GRNN model. In addition, the classical regression
performance was inferior to the ANNs. Statistically, it appeared that both the
BP (r2=0.930) and GRNN (r2=0.927) models fit well for estimating the
event-based SSC at the Jiasian diversion weir. The weir SSC estimation using a
single input with the neural networks showed the dominance of the turbidity
variable over water discharge. Furthermore, the ANN models are more reliable
than the classical regression method for estimating the SSC in the area studied
herein.
Keywords:
Artificial neural networks, river monitoring, suspended sediment
concentration, turbidity, water discharge, modeling.
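A minimal sketch of a feed-forward backpropagation regression of SSC on discharge and turbidity is shown below. The data are synthetic and the relation between the variables is an assumption made only so the example runs end to end; it is not the authors' trained model or dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical hourly storm records: [water discharge (m^3/s), turbidity (NTU)]
rng = np.random.default_rng(0)
X = rng.uniform([10, 50], [300, 2000], size=(200, 2))
# Assumed synthetic relation between turbidity/discharge and SSC (mg/L)
y = 0.8 * X[:, 1] + 0.3 * X[:, 0] + rng.normal(0, 20, 200)

# Feed-forward backpropagation network with one small hidden layer
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

print("r^2 on training data:", model.score(X, y))
print("predicted SSC for discharge=120, turbidity=800:",
      model.predict([[120, 800]])[0])
```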
Title of the
Paper: New Color Correction Method of
Multi-view Images for View Rendering in Free-viewpoint Television
DOWNLOAD FULL PDF
Authors:
Feng Shao, Gangyi Jiang, Mei Yu
Abstract: Color inconsistency between views is an important problem to be
solved in multi-view video applications such as free-viewpoint television. Up
to now, color correction methods have been proposed mainly for consistent color
appearance or high coding efficiency, in which the mean, variance, or
covariance information is used to transfer color information. In this paper, by
using color restoration and linear regression techniques, a new color
correction method for multi-view images is proposed for view rendering. We
first separate foreground and background from the scene by mean-removed
disparity estimation. Then color restoration is used to reconstruct the
original color information for foreground and background. Next, linear
regression is used to estimate correction parameters for the color-restored
image. Finally, color correction and view rendering between the reference image
and the color-corrected image are implemented. Experimental results show that
the proposed method achieves better performance in color correction and view
rendering.
Keywords:
Multi-view image, free viewpoint television, color correction, view rendering,
expectation maximization, linear regression, CIEDE2000.
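As a rough illustration of the linear-regression step mentioned above (not the authors' full pipeline, which also involves color restoration and disparity estimation), the sketch below fits a per-channel gain and offset between a reference view and a target view by least squares and applies the correction.

```python
import numpy as np

def fit_linear_color_correction(reference, target):
    """Fit per-channel gain/offset so that gain*target + offset ~= reference.

    reference, target : float arrays of shape (H, W, 3) with matched content.
    Returns (gain, offset) arrays of shape (3,)."""
    gain, offset = np.empty(3), np.empty(3)
    for c in range(3):
        t = target[..., c].ravel()
        r = reference[..., c].ravel()
        A = np.stack([t, np.ones_like(t)], axis=1)      # design matrix [t, 1]
        (g, o), *_ = np.linalg.lstsq(A, r, rcond=None)  # least-squares fit
        gain[c], offset[c] = g, o
    return gain, offset

def apply_correction(target, gain, offset):
    return np.clip(gain * target + offset, 0.0, 1.0)

# Toy usage with synthetic images
ref = np.random.rand(64, 64, 3)
tgt = np.clip(0.8 * ref + 0.05, 0, 1)        # simulated color shift
g, o = fit_linear_color_correction(ref, tgt)
print(g, o)                                   # roughly [1.25,...], [-0.06,...]
```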
Issue 6, Volume 7, June 2008
Title of the
Paper: The
Strategy based on Game Theory for Cross-Organizational Business Process
Reengineering in Supply Chain
DOWNLOAD FULL PDF
Authors:
Jian-Feng Li, Yan Chen, Xu-Sheng Cui
Abstract: Many enterprises, each with their own interests, are involved in
cross-organizational business process reengineering (BPR) in the supply chain,
which is different from BPR within one enterprise. The enterprises have the
right to take part in the cross-organizational BPR project or not; their
actions, driven by their own interests, will therefore affect the progress of
the project and lead to different results. This is an important matter that has
not been studied deeply in current research, so this paper probes into the
problem and analyzes the reengineering strategy in the supply chain, i.e. how
to adopt a proper method based on game theory in the analysis of enterprises'
interrelated actions under their own interests for BPR in order to achieve good
results. Concretely, the amalgamation of reengineering benefits for different
enterprises, the relationship of reengineering activities and the effects of
different reengineering modes are investigated in depth. This paper studies the
benefits and activities of the reengineering entities and emphasizes the
variety and inducement ability of the reengineering method, which enriches the
current research on BPR and has intrinsic value for actual cross-organizational
BPR projects in the supply chain.
Keywords:
Game theory; Business process reengineering; Cross-organization; Supply chain;
Reengineering strategy; Project management.
Title of the
Paper:
New View Generation Method for Free-Viewpoint Video System
DOWNLOAD FULL PDF
Authors:
Gangyi Jiang, Liangzhong Fan, Mei Yu, Feng Shao
Abstract: View navigation is an attractive function of free-viewpoint video
(FVV) systems. The disparity map, which describes the correspondence between
adjacent views, plays an important role in view generation. However, disparity
estimation is complicated and time-consuming for a user side with a constrained
resource environment. Since a disparity compensation prediction algorithm is
usually utilized at the encoder to improve coding efficiency, a block-based
disparity map can easily be obtained without much extra effort. Therefore, a
view-generation-oriented FVV framework is presented in this paper, in which a
block-based disparity map is generated and encoded losslessly with CABAC at the
server side, while at the user side, the received block-based disparity map is
refined for view generation, and the spatial correlation in the block-based
disparity map is utilized to accelerate the generation of the pixel-wise
disparity map. Experimental results show that the proposed framework provides a
trade-off between the transmission cost and the effort of view generation.
Keywords:
Free viewpoint video, view generation, block based disparity map, disparity
refinement, ray space.
Title of the
Paper: Cost Effective Software Test Metrics
DOWNLOAD FULL PDF
Authors:
Ljubomir Lazic, Nikos Mastorakis
Abstract: This paper discusses software test metrics and their ability to
provide the objective evidence necessary to make process improvements in a
development organization. When used properly, test metrics assist in the
improvement of the software development process by providing pragmatic,
objective evidence of process change initiatives. This paper also describes
several test metrics that can be implemented, a method for creating a practical
approach to tracking and interpreting the metrics, and illustrates one
organization's use of test metrics to prove the effectiveness of process
changes. In addition, this paper provides the Balanced Productivity Metrics
(BPM) strategy and approach for designing and producing useful project metrics
from basic test planning and defect data. Software test metrics are useful for
test managers, aiding in the precise estimation of project effort, and address
the interests of the metrics group and of software managers in the software
organization who are interested in estimating software test effort and
improving both development and testing processes.
Keywords:
Software testing, Test metrics, Size estimation, Effort estimation, Test
effectiveness evaluation
Title of the
Paper: Implementation of Data-Exchanging
System based on Message Oriented Middleware in Agricultural Website
DOWNLOAD FULL PDF
Authors:
Zhang Xiaoshuan, Wu Qinghua, Tian Dong, Zhao Ming
Abstract: With the startup of the Golden Agriculture Project, the
informatization of agriculture is accelerating, and the transformation and
sharing of data are indispensable to it. Implementing data combination, data
transformation and data receiving applications is an important means to share
information safely and enhance efficiency. The paper starts with a survey of
methods for implementing data interchange and introduces some of the methods
and key points of the techniques. Based on this, the paper also presents the
detailed requirement analysis, system design and implementation of the system.
According to the requirements and characteristics of the project, a data
interchange system was developed, and a data interchange model based on
message-oriented middleware (MOM) is presented in this paper, which places a
middleware layer between the province and the ministry taking part in the data
interchange. The system has the following characteristics: (1) it keeps the
data safe and credible while it is transferred; (2) it is highly portable and
applicable; (3) it needs no manual intervention in the data interchange
process; (4) it supports data interchange between databases of different
structures; (5) it is simple to develop and apply. The MOM TongLink/Q offers
interfaces for application development and completes the data transfer over the
Internet. Data management is done by integration adapters developed on the
TongIntegrator application integration framework. This method offers a new
approach to solving the data interchange problem. The system has been
successfully applied in the data interchange project of the Ministry of
Agriculture.
Keywords:
Message Oriented Middleware, Data Exchange, Remote Database, Website
Title of the
Paper: Making A CASE for PACE: Components
of the Combined Authentication Scheme Encapsulation for a Privacy Augmented
Collaborative Environment
DOWNLOAD FULL PDF
Authors:
Geoff Skinner
Abstract: Digital collaborations are proving themselves to be ideal
environments for increasing the productivity and knowledge exploration
capabilities of their members. Many organizations are realizing the diverse
range of benefits they provide, not only to the organization as a whole but
also to individual employees. The challenge in environments that encourage the
sharing of resources, in particular data, is finding a sustainable balance
between the need to provide access to data while also ensuring its security, in
addition to the privacy of the entities it may pertain to. In this paper we
propose an authentication framework that uniquely combines both traditional and
biometric methods of authentication with an additional novel audiovisual method
of authentication. The CASE (Combined Authentication Scheme Encapsulation)
methodology, the name of our solution, provides an effective visual
representation of both the authentication and information privacy hierarchies
associated with data requests within digital collaborative environments.
Keywords:
Information Privacy, Data Security, Authentication, Personal Identity, Digital
Collaborations
Title of the
Paper: The Relationship between Educational
Serious Games, Gender, and Students’ Social Interaction
DOWNLOAD FULL PDF
Authors:
Samah Mansour, Mostafa El-Said
Abstract: Internet-age students are increasingly interested in learning by
playing. The majority of current educational computer games suffer from an
inability to support the learning objectives of course materials. Consequently,
the integration of educational video games into the curriculum is usually met
with resistance from some teachers, administrators, and parents. Multi-player
educational serious games (MPESGs) are introduced as a new type of educational
computer game. Educators and researchers increasingly believe in MPESGs as a
tool for interactive learning. MPESGs might not only motivate students to
learn, but also provide them with innovative ways to develop an understanding
of abstract concepts. In addition, the integration of MPESGs in the learning
environment might promote social interaction among students. This study focused
on exploring MPESGs as a new educational tool. An MPESG called The Village of
Belknap was developed in Second Life to be used as a prototype in this study.
Experiments were carried out and the results indicated that gender did not
influence students' perceptions of social interaction while playing the game.
In addition, the results revealed that the integration of the MPESG in the
learning process did not lead to a significant difference in the perception of
social interaction between the students who participated in the online session
and the students who participated in the face-to-face session.
Keywords:
Avatar, CVE, Educational Games, Role-Playing Game, Second Life, Serious Game.
Title of the
Paper: Human Emotion Recognition System
Using Optimally Designed SVM With Different Facial Feature Extraction
Techniques
DOWNLOAD FULL PDF
Authors:
G. U. Kharat, S. V. Dudul
Abstract: This research aims at developing "humanoid robots" that can carry out
intellectual conversation with human beings. The first step in this direction
is to recognize human emotions by a computer using a neural network. In this
paper all six universally recognized basic emotions, namely anger, disgust,
fear, happiness, sadness and surprise, along with the neutral state, are
recognized. Various feature extraction techniques such as the Discrete Cosine
Transform (DCT), Fast Fourier Transform (FFT) and Singular Value Decomposition
(SVD) are used to extract useful features for emotion recognition from facial
expressions. A Support Vector Machine (SVM) is used for emotion recognition
using the extracted facial features, and the performance of the various feature
extraction techniques is compared. The authors achieved 100% recognition
accuracy on the training dataset and 94.29% on the cross-validation dataset.
Keywords:
Discrete Cosine Transform (DCT), Fast Fourier Transform (FFT), Singular Value
Decomposition (SVD), Support Vector Machine (SVM), Machine Intelligence.
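A minimal sketch of the DCT-plus-SVM pipeline described above appears below, using synthetic images and hypothetical labels rather than a real facial expression dataset; the feature size and SVM parameters are illustrative assumptions only.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.svm import SVC

def dct_features(face, n=8):
    """2-D DCT of a grayscale face image; keep the top-left n x n
    low-frequency coefficients as the feature vector."""
    coeffs = dct(dct(face, axis=0, norm="ortho"), axis=1, norm="ortho")
    return coeffs[:n, :n].ravel()

# Hypothetical data: random 32x32 'face' images with 7 emotion labels
rng = np.random.default_rng(0)
faces = rng.random((70, 32, 32))
labels = np.repeat(np.arange(7), 10)        # 0=anger, ..., 6=neutral

X = np.array([dct_features(f) for f in faces])
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```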
Title of the
Paper: Determination of Insurance Policy
Using a hybrid model of AHP, Fuzzy Logic, and Delphi
Technique: A Case Study
DOWNLOAD FULL PDF
Authors:
Chin-Sheng Huang, Yu-Ju Lin, Che-Chern Lin
Abstract: Based on a previous study, this paper presents evaluation models for
selecting insurance policies. Five models have been built for five types of
insurance: life, annuity, health, accident, and investment-oriented insurance.
The proposed models combine the analytical hierarchy process (AHP), fuzzy logic
and the Delphi technique. The Delphi technique is employed to select inputs,
define fuzzy expressions, and generate evaluation rules for the models. Four
variables are selected as inputs: age, annual income, educational level and
risk preference. These inputs are transformed into fuzzy variables using
trapezoidal membership functions and then fed to AHP. To build the models, we
interviewed twenty domain experts with at least three years of working
experience in insurance companies. To validate the performance, we designed a
computer program and used 300 insurance purchase records to examine the
evaluation results. Validation results and concluding remarks are provided at
the end of this paper.
Keywords:
Insurance policy; Decision making; Fuzzy logic; AHP;
Delphi
technique; Evaluation model
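As a small illustration of the trapezoidal fuzzification step mentioned above, the sketch below evaluates trapezoidal membership functions for the 'age' input; the breakpoints are assumptions, not the expert-defined sets used in the paper.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical fuzzy sets for the 'age' input (breakpoints are assumptions)
age_sets = {
    "young":       (15, 20, 25, 35),
    "middle_aged": (30, 40, 50, 60),
    "senior":      (55, 65, 120, 121),
}

age = 45
for name, (a, b, c, d) in age_sets.items():
    print(name, trapezoid(age, a, b, c, d))
```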
Title of the
Paper: Implementation of a Modified PCX
Image Compression Using Java
DOWNLOAD FULL PDF
Authors:
Che-Chern Lin, Shen-Chien Chen
Abstract: In this paper, we present a new image compression algorithm based on
the PCX algorithm, an image compression method used in the PC Paintbrush bitmap
graphics package. We first introduce the principles of image compression and
the structure of image file formats, and then demonstrate the compression and
decompression procedures of the PCX algorithm. The original PCX algorithm
compresses only one fourth of the data using run-length encoding. The
compression efficiency depends on the repeatability of the data in the
compressed area; if the repeatability is low, the compression performance will
be poor. To avoid this, we propose a modified PCX algorithm which selects the
best area for compression. We designed a computer package to implement the
modified PCX algorithm using the Java programming language. The Unified
Modeling Language (UML) was used to describe the structure and behaviour of the
computer package. The pseudo-code for the compression and decompression of the
modified PCX algorithm is also provided in this paper. We conducted an
experiment to compare the performance of the original and modified algorithms.
The experimental results show that the modified PCX algorithm achieves better
compression performance than the original one.
Keywords:
PCX, Data compression, Image compression, Run-length encoding
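The PCX scheme discussed above is built on run-length encoding. The sketch below shows a plain byte-wise run-length encoder and decoder as a generic illustration; it is not the PCX file format itself, which packs run lengths into flag bytes.

```python
def rle_encode(data):
    """Encode a byte sequence as (count, value) pairs."""
    out = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i] and j - i < 255:
            j += 1
        out.append((j - i, data[i]))
        i = j
    return out

def rle_decode(pairs):
    return bytes(b for count, b in pairs for _ in range(count))

raw = b"\x00\x00\x00\x00\xff\xff\x07\x07\x07"
packed = rle_encode(raw)
print(packed)                      # [(4, 0), (2, 255), (3, 7)]
assert rle_decode(packed) == raw
```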
Title of the
Paper: QoS Multilayered Multicast Routing
Protocol for Video Transmission in Heterogeneous Wireless Ad Hoc Networks
DOWNLOAD FULL PDF
Authors:
Osamah Badarneh, Michel Kadoch, Ahamed Elhakeem
Abstract: In wireless ad hoc networks, nodes are expected to be heterogeneous,
with the multicast destinations differing greatly in their end devices and QoS
requirements. This paper proposes two algorithms for multilayered video
multicast over heterogeneous wireless ad hoc networks: Multiple Shortest Path
Trees (MSPT) and Multiple Steiner Minimum Trees (MSMT). In this paper, we
assume that each destination has a preferred number of video layers, which is
equal to its capacity. Moreover, we consider not only the capacities of the
nodes in the network but also the bandwidth of each link. In order to increase
user satisfaction for a group of heterogeneous destinations, we exploit
different types of multiple multicast tree policies. Simulations show that the
proposed schemes greatly improve the QoS (increase user satisfaction) for a set
of destinations. In addition, simulations show that multiple Hybrid-II
multicast trees offer higher user satisfaction than multiple Hybrid-I multicast
trees and multiple node-disjoint trees, at the cost of robustness against link
failure. It is therefore a trade-off between providing robustness against path
breaks and increasing user satisfaction.
Keywords:
Multilayered multicast; MDC; LC, Heterogeneous wireless ad hoc networks.
Title of the
Paper: The Investigation of the Elliptic
Curve Cryptology Applies to the New Generation Protocol
DOWNLOAD FULL PDF
Authors:
Po-Hsian Huang
Abstract: We face threats from viruses, hackers, electronic eavesdroppers, and
electronic deceivers in the network environment, so the safety issue is truly
important. The growth of computer and network systems makes organizations and
personal users depend more and more on the information that circulates in these
systems. For this reason, we must protect data and resources from leaks and
ensure the reliability of data and information. Moreover, the system should
remain well protected when it suffers attacks from the network. People not only
care about the level of network security when they use credit cards to order
what they want, but are also increasingly aware of network security in general
[1]. The best-known and most widely applied public key cryptosystem, RSA, was
proposed by Rivest, Shamir and Adleman [2]. Its safety is based on the
difficulty of factoring large integers, a well-known hard mathematical problem
for which no efficient method is known; therefore, the RSA algorithm can ensure
safety. To preserve the security of RSA, the key length keeps increasing, which
slows down encryption and decryption and makes hardware implementations
increasingly hard to accept. This phenomenon places a very heavy load on RSA
applications, and carrying out large numbers of secure e-commerce transactions
encounters the same problems. Elliptic Curve Cryptology needs fewer key bytes
than RSA (Rivest-Shamir-Adleman), which allows better computing performance and
faster network transmission. Although the next-generation protocol is modified
from IPv4 and applies IPsec, it is not perfect in terms of network security:
the IP Security Protocol cannot coexist with IPv6 protocol services when IPv6
packets pass through IPv4 networks and Network Address Translation. Addressing
these weaknesses, this paper discusses how Elliptic Curve Cryptology can be
applied in the IPv6 protocol and compares the performance of Elliptic Curve
Cryptology and RSA over IPv6.
Keywords:
Elliptic Curve Cryptology, IPv6, RSA
(Rivest-Shamir-Adleman)
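As background to the comparison above, elliptic curve systems work with points on a curve y² = x³ + ax + b over a prime field, and their security rests on the discrete logarithm problem in that group. The sketch below implements textbook point addition and double-and-add scalar multiplication on a tiny toy curve; the parameters are chosen only for illustration and are far too small for real security.

```python
# Toy curve y^2 = x^3 + 2x + 3 over GF(97); parameters are illustrative only.
P_MOD, A, B = 97, 2, 3

def inv(x):
    return pow(x, P_MOD - 2, P_MOD)      # modular inverse via Fermat's little theorem

def add(P, Q):
    """Textbook elliptic curve point addition; None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                       # P + (-P) = infinity
    if P == Q:
        s = (3 * x1 * x1 + A) * inv(2 * y1) % P_MOD       # tangent slope
    else:
        s = (y2 - y1) * inv(x2 - x1) % P_MOD              # chord slope
    x3 = (s * s - x1 - x2) % P_MOD
    y3 = (s * (x1 - x3) - y1) % P_MOD
    return (x3, y3)

def scalar_mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (3, 6)   # on the curve: 6^2 = 36 and 3^3 + 2*3 + 3 = 36 (mod 97)
print(scalar_mul(5, G))
```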
Title of the
Paper: Neural Network Approach for
Estimating Reference Evapotranspiration from Limited Climatic Data in Burkina Faso
DOWNLOAD FULL PDF
Authors:
Yu-Min Wang, Seydou Traore, Tienfuan Kerh
Abstract: The well-known Penman-Monteith (PM) equation is unquestionably the
most accurate of the existing methods for estimating reference
evapotranspiration (ETo). However, the equation requires climatic data that are
not always available, particularly in a developing country such as
Burkina Faso. ETo is widely used for agricultural water management, and its
accurate estimation is vitally important for computerizing crop water balance
analysis. A previous study therefore developed a reference model for Burkina
Faso (RMBF) that estimates ETo using only temperature as input at two
production sites, Banfora and Ouagadougou. This paper investigates, for the
first time in the semiarid environment of Burkina Faso, the potential of using
an artificial neural network (ANN) for estimating ETo with a limited climatic
data set. The ANN model employed in the study was of the feed-forward
backpropagation (BP) type, using maximum and minimum air temperatures collected
from 1996 to 2006. The results of BP were compared to the RMBF, Hargreaves
(HRG) and Blaney-Criddle (BCR) models, which have been used successfully for
ETo estimation where sufficient data are not available. The results of this
study revealed that the BP predictions were more accurate than those of RMBF,
HRG and BCR. The feed-forward backpropagation algorithm could thus be employed
successfully to estimate ETo in semiarid zones.
Keywords:
Evapotranspiration, estimating, limited climatic data, neural network, feed
forward backpropagation, semiarid environment, water management
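The Hargreaves (HRG) method mentioned above is one of the temperature-only alternatives to Penman-Monteith. A commonly cited form is ETo = 0.0023 · Ra · (Tmean + 17.8) · sqrt(Tmax - Tmin), with Ra the extraterrestrial radiation expressed as equivalent evaporation (mm/day). The sketch below evaluates that form for hypothetical daily temperatures; the input values are illustrative assumptions, not data from the paper.

```python
import math

def hargreaves_eto(t_max, t_min, ra):
    """Hargreaves reference evapotranspiration (mm/day).

    t_max, t_min : daily max/min air temperature (deg C)
    ra           : extraterrestrial radiation in equivalent evaporation (mm/day)
    """
    t_mean = (t_max + t_min) / 2.0
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Hypothetical day in a semiarid zone (values are illustrative assumptions)
print(round(hargreaves_eto(t_max=36.0, t_min=22.0, ra=15.5), 2), "mm/day")
```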
Title of the
Paper: Dynamics Modeling and Trajectory
Tracking Control for Humanoid Jumping Robot
DOWNLOAD FULL PDF
Authors:
Zhao-Hong Xu, Libo-Song, Tian-Sheng Lu, Xu-Yang Wang
Abstract: Jumping and running are non-regular motions in the field of humanoid
robots. The stance phase involves only holonomic constraints, whereas the
flight phase involves both holonomic and non-holonomic constraints. The
relationship between angular momentum and the cyclic coordinate is analyzed
through the dynamics equations. The impact model is derived by transferring the
dynamics equations from the flight phase to the stance phase, and the impact
force and velocity are obtained via the Jacobian matrix. A method for
eliminating shock is discussed. Finally, trajectory tracking using a hybrid of
adaptive fuzzy control and computed torque control is studied on a model with
parametric uncertainty. The hybrid control system is proved to be
asymptotically stable by Lyapunov stability theory. Numerical simulation and
experimental results show that the computed torque controller is valid for the
jumping movement.
Keywords:
jumping robot, dynamics modeling, non-regular motion, trajectory tracking
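For reference, the computed torque part of the hybrid controller above typically takes the standard form τ = M(q)(q̈_d + K_d ė + K_p e) + C(q, q̇)q̇ + G(q) with e = q_d - q. The sketch below evaluates that control law for a hypothetical two-joint model; the dynamics terms are placeholders, not the robot's actual model.

```python
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, G, Kp, Kd):
    """Standard computed torque law:
       tau = M(q)(qdd_des + Kd*(qd_des-qd) + Kp*(q_des-q)) + C(q,qd)@qd + G(q)"""
    e = q_des - q
    e_dot = qd_des - qd
    v = qdd_des + Kd @ e_dot + Kp @ e        # stabilized acceleration command
    return M(q) @ v + C(q, qd) @ qd + G(q)

# Placeholder 2-DOF dynamics (illustrative assumptions only)
M = lambda q: np.diag([2.0, 1.0])
C = lambda q, qd: np.zeros((2, 2))
G = lambda q: np.array([0.0, 9.81 * 0.5])
Kp, Kd = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])

tau = computed_torque(q=np.array([0.1, -0.2]), qd=np.zeros(2),
                      q_des=np.zeros(2), qd_des=np.zeros(2),
                      qdd_des=np.zeros(2), M=M, C=C, G=G, Kp=Kp, Kd=Kd)
print(tau)
```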
Title of the
Paper: Implementation of an Image Retrieval
System Using Wavelet Decomposition and Gradient Variation
DOWNLOAD FULL PDF
Authors:
Kuo-An Wang, Hsuan-Hung Lin, Po-Chou Chan, Chuen-Horng Lin, Shih-Hsu Chang,
Yung-Fu Chen
Abstract: The texture gradient is a popular operation for extracting features
used in content-based image retrieval (CBIR) of texture images, as it captures
the gradient magnitude and direction of adjacent pixels in an image. In this
paper, we propose two methods for retrieving texture images. In the first
method, the discrete wavelet transform (DWT) and gradient operations were
combined to extract image features, with principal component analysis (PCA)
used to determine the weights of the individual extracted features, while in
the second method only the gradient operation, without the discrete wavelet
transform, was used to extract features. The Brodatz album, which contains 112
texture images, each of size 512×512 pixels, was used to evaluate the
performance of the proposed methods. Before the experiment, each image was cut
into sixteen 128×128 non-overlapping sub-images, creating a database of 1792
images. Regarding the number of features, a total of 126 features were
extracted in the first method by calculating gradients after discrete wavelet
transforms of the texture image, while in the second method only 54 features
were extracted from each gradient image. By integrating the useful features,
image retrieval systems for texture images were designed. The results show that
the two proposed methods achieve better retrieval accuracy than the method
proposed by Huang and Dai. Additionally, our proposed systems, especially the
second method, use fewer features, which significantly decreases the retrieval
time compared to the previous investigation.
Keywords:
Content-Based Image Retrieval, Texture, Gradient Operation, Entropy, Discrete
Wavelet Transform (DWT), Principal component analysis
Title of the
Paper: WhiteSteg: A New Scheme in
Information Hiding Using Text Steganography
DOWNLOAD FULL PDF
Authors:
L. Y. Por, T. F. Ang, B. Delina
Abstract: Sending encrypted messages frequently will draw the attention of
third parties, i.e. crackers and hackers, perhaps prompting attempts to break
and reveal the original messages. In the digital world, steganography is
introduced to hide the existence of the communication by concealing a secret
message inside another, unsuspicious message. The hidden message may be
plaintext, or any data that can be represented as a stream of bits.
Steganography is often used together with cryptography and offers an acceptable
amount of privacy and security over the communication channel. This paper
presents an overview of text steganography and a brief history of
steganography, along with various existing techniques of text steganography.
Highlighted are some of the problems inherent in text steganography as well as
issues with existing solutions. A new approach, named WhiteSteg, is proposed
for information hiding using inter-word spacing and inter-paragraph spacing as
a hybrid method to reduce the visible detection of the embedded messages.
WhiteSteg offers dynamically generated cover-text with six options of maximum
capacity according to the length of the secret message. The advantage of
exploiting whitespace in information hiding is also discussed. This paper
further analyzes the significant drawbacks of each existing method and how
WhiteSteg could be recommended as a solution.
Keywords:
Steganography, Text Steganography, Information Hiding, Security, Suspicion.
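As a rough illustration of inter-word-spacing steganography (a generic scheme, not WhiteSteg's actual encoding), the sketch below hides one bit per word gap in a cover text: a single space encodes 0 and a double space encodes 1.

```python
import re

def hide(cover_words, bits):
    """Hide bits in the gaps between words: single space = 0, double space = 1."""
    if len(bits) > len(cover_words) - 1:
        raise ValueError("cover text too short for the message")
    pieces = [cover_words[0]]
    for i, word in enumerate(cover_words[1:]):
        gap = "  " if i < len(bits) and bits[i] == "1" else " "
        pieces.append(gap + word)
    return "".join(pieces)

def reveal(stego_text, n_bits):
    gaps = re.findall(r" +", stego_text)       # whitespace runs between words
    return "".join("1" if len(g) > 1 else "0" for g in gaps[:n_bits])

cover = "the quick brown fox jumps over the lazy dog".split()
stego = hide(cover, "1011")
print(repr(stego))
print(reveal(stego, 4))                        # -> '1011'
```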
Title of the
Paper: Parameter Adjustment for Genetic
Algorithm for Two-Level Hierarchical Covering Location Problem
DOWNLOAD FULL PDF
Authors:
Miroslav Maric,
Milan Tuba,
Jozef Kratica
Abstract: In this paper the two-level Hierarchical Covering Location Problem
(HCLP) is considered. A new genetic algorithm for this problem is developed,
including a specific binary encoding with new crossover and mutation operators
that preserve the feasibility of individuals. A modification that resolves the
problem of frozen bits in the genetic code is proposed and tested. A version of
fine-grained tournament selection [5] was used, as well as the caching GA
technique [12], in order to improve computational performance. The genetic
algorithm was tested and its parameters were adjusted on a number of test
examples; it performed well and proved robust in all cases. The results were
verified by CPLEX.
Keywords:
Genetic Algorithms, Evolutionary computing, Location problem, Hierarchical
location, Covering models
Title of the
Paper: Network Motif & Triad Significance
Profile Analyses on Software System
DOWNLOAD FULL PDF
Authors:
Zhang Lin, Qian GuanQun, Zhang Li
Abstract: There has been considerable recent interest in network motifs for
understanding the local features of networks as well as their growth and
evolution mechanisms. In order to discover the local features of software
networks, we extended network motif research methods to the software domain.
After comparing the triad significance profiles of 138 Java open source
software packages, we found three typical kinds of network motifs. Moreover,
the software networks could be divided into three clusters which are consistent
with the known super-families from various other types of networks. It appears
that software scale and interaction may be the reasons for the different motif
significance profile (SP) distributions. The concepts, principles and steps
associated with the experiment are elaborated, the results are analyzed and
discussed, and directions for further research are given.
Keywords:
Software Network, Network Motif, Triad Significance Profile, Superfamily
Title of the
Paper: OCR
for Printed Kannada Text to Machine Editable Format using Database Approach
DOWNLOAD FULL PDF
Authors:
B. M. Sagar, Shobha G., Ramakanth Kumar P.
Abstract: This paper describes an Optical Character Recognition (OCR) system
for printed text documents in Kannada, a South Indian language. The proposed
OCR system recognizes printed Kannada text and can handle all types of Kannada
characters. The system first extracts the image of the Kannada script, then
performs line segmentation, and then segments the words into sub-character-level
pieces. For character recognition we used a database approach. The level of
accuracy reached 100%.
Keywords:
Optical Character Recognition, Segmentation, Kannada Scripts
Title of the
Paper: Establishment of Computational
Models for Clothing Engineering Design
DOWNLOAD FULL PDF
Authors:
Amo Aihua, Li Yi, Wang Ruomei, Lou Xiaonan
Abstract: Clothing design achieved through an engineering framework and method
is a new interest in the state of the art of textile research, which may bring
many advantages, such as improving design efficiency and strengthening the
ability to consider more issues. The mathematical models describing the
underlying physical and chemical mechanisms of the various behaviors inside
textile materials, and of the interactions at their boundary with the external
environment, play a significant role in a clothing engineering design system,
since they offer designers and users an advanced ability to simulate and
preview the functional performance of textile products. However, the
assumptions and simplifications made in the development of computational
models determine their potential and suitability to be developed as
CAD/simulation software for engineering design applications. Considering this
issue, criteria for selecting suitable computational models for engineering
design purposes are of enormous importance in the development of an
engineering design CAD system, and the communication sockets between the
boundaries of the different models integrated for simulating the different
functions of the human body-clothing-environment system need to be developed
with scientific and accurate data flows. In this paper, along with the
development of the framework for the engineering design of textile products, a
discussion of model selection criteria is presented, considering the underlying
physical mechanisms, application limitations, parameter measurability and data
availability. A critical analysis of the well-known models for thermal
behaviors in clothing is conducted, and thermal functional engineering design
systems for textile products, with the integration of different computational
models as well as boundary communication sockets, are demonstrated.
Keywords:
Computational models, selection criteria, engineering design, thermal
functions, clothing
Title of the
Paper: Vehicle Number Plate Recognition
Using Mathematical Morphology and Neural Networks
DOWNLOAD FULL PDF
Authors:
Humayun Karim Sulehria, Ye Zhang, Danish Irfan, Atif Karim Sulehria
Abstract: This paper presents a method for recognizing the vehicle number plate
from an image using neural networks and mathematical morphology. The main theme
is to use different morphological operations in such a way that the number
plate of the vehicle can be extracted efficiently. The method makes the
extraction of the plate independent of the colour, size and location of the
number plate. The proposed approach can be divided into simple processes:
image enhancement, morphing transformation, morphological gradient, combination
of the resultant images, and extraction of the number plate from the objects
that are left in the image. Segmentation is then applied and the plate is
recognized using a neural network. This algorithm can quickly and correctly
recognize the number plate from the vehicle image.
Keywords:
Mathematical morphology, morphological gradient, vehicle number plate,
morphing transformations, image enhancement.
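A minimal sketch of the morphological-gradient step mentioned above follows, assuming OpenCV is available and that a local image file named 'car.jpg' exists (both assumptions). The full pipeline in the paper additionally combines resultant images and uses a neural network for recognition.

```python
import cv2

# Load a vehicle image in grayscale (file name is a placeholder assumption)
img = cv2.imread("car.jpg", cv2.IMREAD_GRAYSCALE)

# Morphological gradient = dilation - erosion; it highlights edges such as
# the high-contrast characters and border of a number plate.
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
gradient = cv2.morphologyEx(img, cv2.MORPH_GRADIENT, kernel)

# Binarize and close gaps so plate-like regions become connected blobs.
_, binary = cv2.threshold(gradient, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE,
                          cv2.getStructuringElement(cv2.MORPH_RECT, (17, 3)))

# Candidate plate regions: wide, short connected components.
contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if w > 2 * h and w * h > 1000:         # crude aspect-ratio/area filter
        print("candidate plate region:", (x, y, w, h))
```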
Title of the
Paper: Management Agent for Search
Algorithms with Surface Optimization Applications
DOWNLOAD FULL PDF
Authors:
Jukkrit Kluabwang, Deacha Puangdownreong, Sarawut Sujitjorn
Abstract: This paper presents a management approach applied to search
algorithms to achieve a more efficient search. It acts as a management agent
for a core search unit, in which the Adaptive Tabu Search (ATS) has been
applied. The proposed management agent consists of a partitioning mechanism
(PM), a sequencing method (SM), and a discarding mechanism (DM) to speed up the
search. It has been tested on Bohachevsky's, Rastrigin's and Shekel's foxholes
functions, respectively, for surface optimization. The paper gives a review of
the ATS and detailed explanations of the PM, SM, and DM, respectively. A
comparison of the optimization results is elaborated.
Keywords:
search algorithms, management agent, partitioning mechanism, discarding
mechanism, sequencing method, adaptive tabu search
Title of the
Paper: Light Weight Log Management
Algorithm for Removing Logged Messages of Sender Processes with Little
Overhead
DOWNLOAD FULL PDF
Authors:
Jinho Ahn
Abstract: Sender-based message logging allows each message to be logged in the
volatile storage of its corresponding sender. This behavior avoids logging
messages synchronously on stable storage and results in lower failure-free
overhead than receiver-based message logging. However, in the first approach,
each process has to keep in its limited volatile storage the log information of
its sent messages for recovering their receivers. In this paper, we propose a
2-step algorithm to efficiently remove logged messages from volatile storage
while ensuring the consistent recovery of the system in case of process
failures. In the first step, the algorithm eliminates useless log information
in volatile storage with no extra messages and no forced checkpoints. Even
after this step, however, more empty buffer space for logging future messages
may be required. In this case, the second step forces the useful log
information to become useless by maintaining a vector that records the size of
the information for every other process. This behavior incurs fewer additional
messages and forced checkpoints than existing algorithms. Experimental results
verify that our algorithm performs significantly better than the traditional
one with respect to garbage collection overhead.
Keywords:
Distributed systems, Fault-tolerance, Rollback recovery, Sender-based message
logging, Checkpointing, Garbage collection
Title of the
Paper: Deriving Ontologies using
Multi-Agent Systems
DOWNLOAD FULL PDF
Authors:
Victoria Iordan, Antoanela Naaji, Alexandru Cicortas
Abstract: Complex systems are designed using multi-agent concepts. Agent
interaction is complex and requires appropriate models for communication and
cooperation. The interaction between the users and the system agents must also
be done in an efficient way. One of the basic conditions is to use a convenient
"language", a common way of understanding, and ontology is the appropriate
concept for this. The operations on ontologies cover many such requirements.
The complexity of system interaction has an impact on the different ontologies
used within them. Our model tries to define a specific operation that derives
an ontology from another one. Competence descriptions in education are given as
an application. The research for this paper has been partially supported by the
project PN II 91-047/2007.
Keywords:
Ontology, competence description, multi-agent systems
Title of the
Paper: Grid Computing Services for Parallel
Algorithms in Medicine and Biology
DOWNLOAD FULL PDF
Authors:
Dragos Arotaritei, Marius Turnea, Marius Cristian Toma, Radu Ciorap, Mihai
Ilea
Abstract: Compartmental models based on differential equations are basic models
in epidemiology. The temporal evolution of spatial models for epidemic
spreading is suitable for parallelization, and GRID services are a solution for
speeding up the algorithms used in these models. We investigate several
computational aspects of the parallel algorithms used in a cellular automata
model and a small-world network model. The four-compartment small-world network
model of disease propagation (SEIR) is parallelized. In the second application,
we found a zero-order asymptotic solution for a nonlinear parabolic
differential equation system with a single small parameter, in a cancerous
disease model. We obtained the cancerous cell density by simulation in three
stages, according to the system parameter values. A Grid service has been
constructed for these numerical simulations.
Keywords:
Epidemiological models, cancerous cell density, parallel algorithms, high
performance computing
Title of the
Paper: A Framework to Identify the
‘Motivational Factors’ of Employees; A Case Study of Pakistan IT Industry
DOWNLOAD FULL PDF
Authors:
Muhammad Wisam Bhatti, Ali Ahsan, Ali Sajid
Abstract: Employee motivation is one of the key drivers of success in today’s
competitive environment. Relevant literature generally explains that motivated
employees can perform their tasks much better than demotivated workers. For
this reason there is a standing requirement for a comprehensive framework
providing complete guidelines with which supervisors can identify the core
factors that motivate employees. In line with this requirement, this research
paper presents a self-formulated framework, the 'Imperative Motivational
Factors Framework' (IMFF). The proposed framework familiarizes the necessary
stakeholders with the process of identifying core motivational factors. The
framework takes into account very generic factors identified from various
motivational theories, society and industry. Once the generic factors are
identified, the framework formulates specific factors for a group of employees
and/or for an individual employee.
Keywords:
Employee Motivation, Framework, Case Study, Pakistan’s IT Industry, Satisfaction, Soft Issues
Issue 7, Volume 7, July 2008
Title of the Paper:
Object Oriented Implementation Monitoring Method of Zone Feature in Land
Consolidation Engineering Using SPOT 5 Imagery
DOWNLOAD FULL PDF
Authors:
Wei Su, Chao Zhang, Ming Luo, Li Li, Yujuang Wang, Zhengshan Ju, Daoliang Li
Abstract: Land consolidation is an effective activity for realizing the
sustainable utilization of land, and implementation monitoring of zone-type
land consolidation engineering is essential. Funded by the National High
Technology Research and Development Program of China, an object-oriented
monitoring method is proposed in this research. Object correlation images
(OCIs) are used to measure whether a zonal object has been consolidated (i.e.,
changed). Three correlation parameters are used in this study: correlation,
slope and intercept in the correlation analysis process, while spectral and
textural information (four Grey Level Co-occurrence Matrix (GLCM) features:
Homogeneity, Contrast, Angular Second Moment and Entropy) is used in the
calculation of object correlation values. The approach consists of three
phases: (1) multi-resolution image segmentation, (2) correlation analysis of
two-phase remote sensing images, and (3) implementation monitoring based on
the segmented correlation results. Firstly, the remote sensing images acquired
before and after land consolidation are partitioned into objects using a
multi-resolution segmentation method. Secondly, correlation analysis is
performed between these images. Finally, focusing on these regions,
implementation monitoring is done based on the comparability of image objects
in the same area resulting from the two-phase remote sensing images. Accuracy
assessment results indicate that this method can be used to monitor the
implementation status of land consolidation engineering, with a total accuracy
of up to 86.30%.
Keywords:
Object oriented, land consolidation engineering, implementation monitoring,
object correlation images (OCIs), image segmentation, Fangshan district
Title of the Paper:
Texture feature Extraction for Land-Cover Classification of Remote Sensing
Data in Land Consolidation District Using Semi-Variogram Analysis
DOWNLOAD FULL PDF
Authors:
Yan Huang, Anzhi Yue, Su Wei, Daoliang Li, Ming Luo, Yijun Jiang, Chao Zhang
Abstract: The areas of the land consolidation projects are generally small, so
the remote sensing images used in land-cover classification for the land
consolidation are generally high spatial resolution images. The spectral
complexity of land consolidation objects results in specific limitation using
pixel-based analysis for land cover classification such as farmland, woodland,
and water. Considering this problem, two approaches are compared in this
study. One is the fixed window size co-occurrence texture extraction, and
another is the changeable window size according to the result of
semi-variogram analysis. Moreover, the methodology for optimizing the
co-occurrence window size in terms of classification accuracy performance is
introduced in this study. The Zhaoquanying land consolidation project, located
in Shunyi District, Beijing, China, is selected as an example; texture
features are extracted from SPOT5 remote sensing data in the TitanImage
development environment and used in the classification. The accuracy
assessment result shows that the classification accuracy is improved
effectively using the method introduced in this paper.
Keywords:
Semi-variogram, Land Consolidation, Texture feature, Classification
Title of the Paper: Toward
a System for Road Network Automatic Extraction in Land Consolidation using
High Spatial Resolution Imagery
DOWNLOAD FULL PDF
Authors:
Rui Guo, Ming Luo, Wei Su, Daoliang Li, Yijun Jiang, Zhengshan Ju, Jun Wang
Abstract: Land consolidation is a tool for increasing the area of the arable
land and improving the effectiveness of land cultivation. This paper presents
a practical system for automatic road extraction in land consolidation to
monitor the implementation of the project. The system integrates processing of
color image data and information from digital spatial databases, takes into
account context information, employs existing knowledge including plans of
land consolidation, rules and models, and treats each road subclass
accordingly. The system was designed as a three-tier construction comprising
interface, modules and database. The prototype system has been implemented as
a stand-alone software package and has been tested on a large number of images
in different land consolidation areas. Parallel line segments are first
detected, and then improved Active Contour Models (Snakes) are introduced to
link the extracted road segments into whole networks. The system was piloted
in the study area of Fangshan in Beijing and achieved satisfactory results.
Keywords:
system, road extraction, land consolidation, high spatial resolution imagery,
Snakes
Title of the Paper:
Comparison and Analysis Methods of Moderate-Resolution Satellite Remote
Sensing Image Classification
DOWNLOAD FULL PDF
Authors:
Jinli Chen, Ming Luo, Li Li, Daoliang Li, Chao Zhang, Yan Huang, Yijun Jiang
Abstract: Moderate resolution remote sensing images provide broad spectrum,
high spatial resolution, and rich texture information. However, most
traditional classification approaches are based exclusively on the digital
number of the pixel itself, so only the spectral information is used for the
classification. However, some studies have shown that pixel-based approaches
to classification of remotely sensed data are not very suitable for the
analysis of moderate-resolution images. In order to achieve reasonable
planning and effective management of land cover, the paper provides a new
classification and extraction method. Object-oriented image classification
technology is used in an experiment on land cover information extraction from
CBERS-01 data and compared with the results of pixel-based approaches. The
results show that the object-oriented technique is a more suitable method for
moderate-resolution remote sensing image classification and yields better
classification results.
Keywords:
Object-oriented, moderate-resolution, CBERS-01, land cover, classification
Title of the Paper: Spatial
Decision Support System for the Potential Evaluation of Land Consolidation
Projects
DOWNLOAD FULL PDF
Authors:
Xiaochen Zou, Ming Luo, Wei Su, Daoliang Li, Yijun Jiang, Zhengshan Ju, Jun
Wang
Abstract: Land consolidation is the basis for making the special land
arrangement plan; meanwhile, delimiting land consolidation sub-areas,
ascertaining land consolidation items and setting land consolidation indices
depend mainly on land consolidation potentiality, so this research is
necessary. As the most important pattern of land consolidation, the potential
evaluation of cultivated land consolidation is especially essential. However,
among theoretical and empirical studies in mainland China, few discuss the
connotation and evaluation of cultivated land consolidation potentiality.
Facing this situation, and in order to analyze the potentiality of cultivated
land, this research is compiled in this paper. Nowadays, spatial decision
support systems (SDSS) have been applied in a variety of professions and
domains, not only in fundamental research but also in concrete project
applications. An SDSS not only solves quantitative problems but also deals
well with uncertain, fuzzy information, and it can help decision-makers to
make sensible decisions. Facing the land consolidation problem, and aiming to
evaluate the potential of land consolidation effectively, we developed an SDSS
for evaluating land consolidation potential. In this research, land
consolidation potentiality was evaluated from the following four parts: the
potential for new effective arable land area, the potential for improving
productivity, the potential for reducing production costs, and the potential
for improving the ecological environment. In order to check the result of the
evaluation, a Fuzzy Assessment Model, a Gray Correlation Analysis Model and a
PPE model based on RGRA are adopted in this SDSS. Through this study, we
provide land managers and political departments with an approach that is
scientifically sound and practical.
Keywords:
land consolidation, potential evaluation, model, SDSS
Title of the Paper:
Adaptive Kalman Procedure for SAR High Resolution Image Reconstruction in the
Planning Phase of Land Consolidation
DOWNLOAD FULL PDF
Authors:
Li Li, Ming Luo, Chao Zhang, Wei Su, Yijun Jiang, Daoliang Li
Abstract: Remote Sensing technologies provide the spatial data/maps and offer
great advantages for a land consolidation project. However, in some regions,
optical and infrared remote sensing cannot work well. SAR (Synthetic
aperture radar), an active microwave remote sensing imaging radar, has the
unique capabilities of obtaining abundant electromagnetic information from
ground objects all day/all night and all weather, and penetrating some special
objects and detecting the shapes of ground objects. At this point, SAR can
meet the requirement. However, for land consolidation application, high
spatial resolution SAR images are required. To increase the spatial resolution
of SAR images, this work presents a novel approximate iterative and recurrent
approach for image reconstruction, namely adaptive Kalman Filter (KF)
procedure. Mathematical models and Kalman equations are derived. The matched
filter and Kalman Filter are integrated to enhance the resolution beyond the
classical limit. Simulation results demonstrate that the method strongly
improves the resolution by using prior knowledge, which is a significant
advance given that traditional pulse compression constrains the improvement of
SAR spatial resolution. It is also shown to be an optimal method in the
mean-square-error sense, and its computation cost is lower than that of the
traditional Kalman Filter algorithm.
Keywords:
Land, Agriculture, Synthetic Aperture Radar, Adaptive Kalman Filter, High
Resolution, Mean Square Error, Image Reconstruction
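The paper's adaptive, matched-filter-integrated derivation is not reproduced here, but for readers unfamiliar with the underlying machinery, the following generic linear Kalman predict/update step (plain NumPy, illustrative variable names) shows the recursion that such a filter iterates over the data.

import numpy as np

# Generic linear Kalman predict/update recursion (not the paper's adaptive,
# matched-filter-integrated variant) for a state x with model
#   x_k = F x_{k-1} + w,  z_k = H x_k + v,   w ~ N(0, Q), v ~ N(0, R).
def kalman_step(x, P, z, F, H, Q, R):
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new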
Title of the Paper:
Particle Swarm Optimization for Multiuser Asynchronous CDMA Detector in
Multipath Fading Channel
DOWNLOAD FULL PDF
Authors:
Jyh-Horng Wen, Chuan-Wang Chang, Ho-Lung Hung
Abstract: A multiuser detector for direct-sequence code-division
multiple-access systems based on particle swarm optimization (PSO) algorithm
is proposed. To work around the potential computational intractability of
maximum-likelihood (ML) detection, the proposed scheme exploits heuristics
that balance global and local exploration. Computer simulations demonstrate
that the proposed detector offers near-optimal performance with considerably
reduced computational complexity compared with existing sub-optimum detectors.
Keywords:
Code-division multiple access, particle swarm optimization, evolutionary
algorithm, multiuser detection.
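As a rough illustration of how a particle swarm can search the space of bit vectors, here is a small binary-PSO sketch in Python that maximizes a generic ML-style metric; the matrix R and vector y are stand-ins for the paper's multipath CDMA channel quantities, and the parameter values are arbitrary.

import numpy as np

# Illustrative binary PSO searching bit vectors b in {-1,+1}^K that maximize a
# generic ML-style metric 2*b.y - b.R.b; R and y are stand-ins for the paper's
# multipath CDMA channel quantities, not its actual model.
def pso_detect(R, y, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    K = len(y)
    rng = np.random.default_rng(0)
    metric = lambda b: 2.0 * b @ y - b @ R @ b
    vel = rng.normal(0.0, 1.0, (n_particles, K))
    pos = rng.choice([-1.0, 1.0], size=(n_particles, K))
    pbest = pos.copy()
    pbest_val = np.array([metric(b) for b in pos])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, K))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # sigmoid of the velocity gives the probability that a bit is +1
        prob = 1.0 / (1.0 + np.exp(-vel))
        pos = np.where(rng.random((n_particles, K)) < prob, 1.0, -1.0)
        vals = np.array([metric(b) for b in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest

In this toy setting, pso_detect(np.eye(4), np.array([1.0, -1.0, 1.0, 1.0])) should recover the bit vector [1, -1, 1, 1].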
Title of the Paper:
Load-Balance and Fault-Tolerance for Embedding a Complete Binary Tree in an
IEH with N-expansion
DOWNLOAD FULL PDF
Authors:
Jen-Chih Lin
Abstract: Embedding is of great importance in the applications of parallel
computing. Every parallel application has its intrinsic communication pattern.
The communication pattern graph is embedded in the topology of multiprocessor
structures so that the corresponding application can be executed. This paper
presents strategies for reconfiguring a complete binary tree in a faulty
Incrementally Extensible Hypercube (IEH) with N-expansion. This embedding
algorithm shows that a complete binary tree can be embedded in a faulty IEH
with dilation 4, load 1, and congestion 1 such that O(n^2 - h^2) faults can be
tolerated, where n is the dimension of the IEH and (h-1) is the height of the
complete binary tree. Furthermore, the presented embedding methods are
optimized mainly for balancing the processor loads, while minimizing dilation
and congestion as far as possible. According to the result, we can embed the
parallel algorithms developed by the structure of complete binary tree in an
IEH. This methodology of embedding enables extremely high-speed parallel
computation.
Keywords:
Incrementally Extensible Hypercube, Binary tree, Load-Balance,
Fault-Tolerance, Embedding
Title of the Paper:
Parallel Crawler Architecture and Web Page Change Detection
DOWNLOAD FULL PDF
Authors:
Divakar Yadav, Ak Sharma, J. P. Gupta
Abstract: In this paper, we put forward a technique for parallel crawling of
the web. The World Wide Web today is growing at a phenomenal rate. It has
enabled a publishing explosion of useful online information, which has
produced the unfortunate side effect of information overload. The size of the
web as of February 2007 stands at around 29 billion pages. One of the most
important uses of crawling the web is for indexing purposes and keeping web
pages up to date, which are later used by search engines to serve end-user
queries. The paper puts forward an architecture built along the lines of a
client-server architecture. It discusses a fresh approach for crawling the web
in parallel using multiple machines and also addresses the practical issues of
crawling. A major part of the web is dynamic, and hence a need arises to
constantly update the changed web pages. We use a three-step algorithm for
page refreshment, which checks whether the structure of a web page has
changed, whether the text content has been altered, and whether an image has
changed. For the server, we discuss a unique method for distributing URLs to
client machines after determining their priority index. A minor variation to
the method of prioritizing URLs on the basis of forward-link count is also
discussed to account for update frequency.
Keywords: Divakar Yadav,
Ak
Sharma and J.P.Gupta
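To illustrate the three-step change check described in the abstract above (structure, text content, images), here is a hash-based Python sketch; the regex parsing and helper names are our own simplifications, not the paper's algorithm.

import hashlib
import re

# Illustrative three-step page change check (structure, text, images) using
# content hashes; the helpers and regex parsing are assumptions, not the
# paper's actual algorithm.
def _digest(s):
    return hashlib.md5(s.encode("utf-8")).hexdigest()

def page_signature(html):
    tags = "".join(re.findall(r"<\s*([a-zA-Z0-9]+)", html))      # tag sequence
    text = re.sub(r"<[^>]+>", " ", html)                          # visible text
    imgs = "".join(sorted(re.findall(r'<img[^>]*src="([^"]+)"', html)))
    return {"structure": _digest(tags), "text": _digest(text), "images": _digest(imgs)}

def detect_changes(old_sig, new_html):
    new_sig = page_signature(new_html)
    changed = {k: old_sig[k] != new_sig[k] for k in new_sig}
    return changed, new_sig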
Title of the Paper:
Application Internet Multimedia on
Region Travel Route Information Establishment
DOWNLOAD FULL PDF
Authors:
Tingsheng Weng
Abstract: This research takes the Alishan area, which is filled with unique
and rich cultural resources, as its case study area. Through the application
of information technology and multimedia techniques, a website with functions
such as e-book browsing and multiple languages, integrating the history,
people, natural environment and traffic route information particular to this
area, has been established in Chinese, English, and Japanese versions. The
purpose of this study is to present the rich information of this area and to
attract tourists to understand the relationships between the eco-environment,
people, geography, and industrial development. By leading people to become
familiar with nature, the study also intends to achieve several goals, such as
promoting environmental protection awareness, increasing understanding of
local natural information, and fostering local industry. The discussion in the
paper is also in accordance with the various information and theories that
browsers can search for on the net. The website design in this study takes
digital education into account and aims to improve digital life and
e-learning. Through this abundance of information, the goal of this paper is
to enhance the flourishing development of culture, travel marketing, digital
education value and society.
Keywords:
Multilanguage website, Traveling and Internet, 3D, International, Digital
life, e-Learning, flow
Title of the Paper:
Constraint Satisfaction Problems Solved by Semidefinite Relaxations
DOWNLOAD FULL PDF
Authors:
Mohamed Ettaouil, Chakir Loqman
Abstract: We consider the constraint satisfaction problem (CSP), where the
values must be assigned to variables which are subject to a set of
constraints. This problem is naturally formulated as a 0-1 quadratic knapsack
problem subject to a quadratic constraint. In this paper, we present a
branch-and-bound algorithm for 0-1 quadratic programming which is based on
solving semidefinite relaxations. At each node of the enumeration tree, a
lower bound is given naturally by the value of the semidefinite programming
(SDP) relaxation, and an upper bound is computed by satisfying the quadratic
constraint. We show that this method is able to determine whether a CSP has a
solution or not. We then give some hints on how to reduce the initial size of
the CSP as much as possible. Some numerical examples assess the effectiveness
of the theoretical results shown in this paper and the advantage of the new
formulation.
Keywords:
Constraint satisfaction problem, 0-1 Quadratic knapsack problem, SDP
relaxation, Branch-and-bound, Filtering algorithms.
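As a rough illustration of the kind of 0-1 quadratic formulation the abstract refers to (the paper's exact modelization may differ, and the knapsack side is omitted), a CSP over variables X_1,...,X_n with finite domains D_i can be encoded with binary indicators x_{ij}, where x_{ij} = 1 means that X_i takes its j-th value:

\min_{x \in \{0,1\}^{N}} \; x^{\top} Q\, x
\qquad \text{s.t.} \qquad \sum_{j \in D_i} x_{ij} = 1, \quad i = 1, \dots, n,

with Q_{(ij),(kl)} = 1 whenever the joint assignment X_i = j, X_k = l violates some constraint and 0 otherwise. Under this encoding the CSP is satisfiable exactly when the optimal value is 0, which is the feasibility question that branch-and-bound with SDP lower bounds can decide.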
Title of the Paper:
Design and Implementation of a Cipher System (LAM)
DOWNLOAD FULL PDF
Authors:
Panagiotis Margaronis, Emmanouil Antonidakis
Abstract: This paper presents the design and implementation of a digital
integrated encryption/decryption circuitry called LAM which is based on
Peripheral Component Interconnect (PCI) Architecture for a Personal Computer
(PC) communication card. The implementation of a hardware PC cryptography card
has been designed using a Field Programmable Gate Array (FPGA) chip in
combination with the PCI bus. The main objective of this paper is to provide
the reader with deep insight into the design of a digital cryptographic
circuit, which was designed for an FPGA chip with the use of the Very
High-Speed Integrated Circuit Hardware Description Language (VHDL) for a PCI
card. A demonstration of the LAM circuitry is presented. To see the effect of
LAM cryptography on the operation of the card, it was also simulated and
analyzed. The simulations were run under various conditions applicable to most
PCI applications.
Keywords:
Hardware, Security, Communication, Computer, Design, Architecture
Title of the Paper:
Optimizing Energy Consumption of Data Flow in Mobile
Ad Hoc Wireless Networks
DOWNLOAD FULL PDF
Authors:
Haiyang Hu, Hua Hu
Abstract: Because of the limited node battery power, energy optimization is
important for the mobile nodes in wireless ad hoc networks. Controlled node
mobility is an effective approach to reduce the communication energy
consumption while the movement itself also consumes energy. Based on
cooperative communication (CC) model, this paper takes the energy consumption
of node movement and their individual residual energy into account, and then
proposes localized algorithms for directing mobile node mobility and adapting
their transmission power dynamically in mobile ad hoc networks. Compared with
other algorithms by simulation, our mechanism named DEEF shows its efficiency
in improving the system lifetime.
Keywords:
ad hoc networks; node mobility; cooperative communication; energy consumption;
multi-flow; local algorithm
Title of the Paper:
Video-based Wireless Sensor Networks Localization Technique Based on Image
Registration and SIFT Algorithm
DOWNLOAD FULL PDF
Authors:
Daniela Fuiorea, Vasile Gui, Dan Pescaru, Petronela Paraschiv, Istin Codruta,
Daniel Curiac, Constantin Volosencu
Abstract: Localization discovery techniques are required by most Wireless
Sensor Networks applications. Moreover, in the case of video surveillance,
localization includes not only spatial coordinates but also camera direction
and video-field overlap estimation. Accurate methods involve expensive and
power-consuming hardware. This paper presents an efficient node localization
method, including video-field overlap estimation, that employs image
registration to align images quasi-simultaneously acquired from different
video sensors. A SIFT algorithm is used to discover common features between
pairs of images. Experimental and simulation evaluations show the estimation
accuracy compared with a manual approach.
Keywords:
wireless sensor networks, topology, localization, deployment, image
processing, image registration, SIFT
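For readers who want to see the registration step in practice, the following sketch aligns two node images with OpenCV's SIFT features and a RANSAC homography; it assumes opencv-python (>= 4.4, where SIFT_create is available), uses placeholder file names, and is not the authors' pipeline.

import cv2
import numpy as np

# Sketch: estimate the alignment between two camera-node images with SIFT
# features, a ratio-test match filter, and a RANSAC homography.
img1 = cv2.imread("node_a.png", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img2 = cv2.imread("node_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]               # Lowe's ratio test

src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print("homography relating the two views:\n", H)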
Title of the Paper:
Measuring with Ultra-Low Power Wi-Fi Sensor Tags in LabVIEW
DOWNLOAD FULL PDF
Authors:
Tom Savu, Marius Ghercioiu
Abstract: G2 Microsystems of Campbell, California, USA, released in 2007 the
first ever ultra-low power Wi-Fi System on a Chip (SoC) named G2C501. This SoC
includes a 32-bit CPU, crypto accelerator, real-time clock and a versatile
sensor interface that can serve as a standalone host subsystem. The G2C501
goes beyond today’s basic radio frequency identification (RFID) technology to
offer intelligent tracking and sensor capabilities that leverage IEEE 802.11
(Wi-Fi) networks. Due to its support for multiple location technologies, small
form factor and ultra-low power consumption, the G2C502 SoC can be integrated
into Wi-Fi sensor tags that lower cost of ownership and meet the needs of a
variety of industries including consumer electronics, pharmaceuticals,
chemical manufacturing, cold chain and more.
A battery-powered, small-size, ultra-low-power Wi-Fi wireless measurement node
named IP Sensor has been built using the G2C501 SoC. Sensors for measurement of
temperature, humidity, light, and vibration or motion are currently mounted on
the IP Sensor board. The node is able to read a sensor and send data to the
network by using an IP-based application protocol such as UDP.
This paper describes the new IP Sensor device and gives a programming
methodology using LabVIEW.
Keywords:
Wi-Fi sensors, System on a Chip (SoC), ultra-low power, LabVIEW
Title of the Paper:
Business Intelligence Applications for University Decision Makers
DOWNLOAD FULL PDF
Authors:
Carlo Dell'Aquila, Fransesco Di Tria, Ezio Lefons, Filippo Tangorra
Abstract: The development of a Business Intelligence Application starts with
the execution of OLAP queries to extract information from data stored in a
Data Warehouse. The results of these queries, together with an opportune data
representation, offer a deep synthesis of the data and help business users
discover hidden knowledge more effectively than conventional database queries.
Nowadays, information discovery is getting more and more important also in
academic environments, because internal evaluation teams have to provide the
guidelines to improve the quality of both Didactics and Research. In this
paper, we present an approach to implement Business Intelligence technology in
the University context. The architecture of the data warehouse and examples of
analytic queries for the didactics management are also presented and
discussed.
Keywords:
Data warehouse (DW), Data mart (DM), Online analytical processing (OLAP),
Academic application.
Title of the Paper: A
Tool for the Creation and Management of Level-of-Detail Models for 3D
Applications
DOWNLOAD FULL PDF
Authors:
Oscar Ripolles, Francisco Ramos, Miguel Chover, Jesus Gumbau, Ricardo Quiros
Abstract: Real-time visualization of 3D scenes is a very important feature of
many computer graphics solutions. Current environments require complex scenes
which contain an increasing number of objects composed of thousands or even
millions of polygons. Nevertheless, this complexity poses a problem for
achieving interactive rendering. Among the possible solutions, stripification,
simplification and level of detail techniques are very common approaches to
reduce the rendering cost. In this paper, we present a set of techniques which
have been developed for offering higher performance when rendering 3D models
in real-time applications. Furthermore, we also present a standalone
application useful to quickly simplify and generate multiresolution models for
arbitrary geometry and for trees.
Keywords:
Computer graphics, level of detail, simplification, triangle strips,
multiresolution, tree rendering
Title of the Paper:
Using Computer-aided Techniques in the Dynamic Modeling of the Human Upper
Limb
DOWNLOAD FULL PDF
Authors:
Antoanela Naaji
Abstract: The wide mobility of human body leads to the necessity of modeling
the osteoarticular system as a mechanism with a large number of degrees of
freedom. Dynamic modeling of osteoarticular system is necessary because the
exertion of various actions and natural physiological movements are
essentially dynamic. Very often we use a simplified model because the
phenomena produced are so complex that accurate mathematical reproduction is
practically impossible. A dynamic model must provide a good estimate of total
weight and mass distribution as well as transmissibility and amortization
properties of bones, muscles, joints, blood and skin.
The paper presents a dynamic functional model that considers the human upper
limb as a mechanical system with 5 degrees of freedom in the case where the
segments are moved by their own weight forces. The bones and the muscles were
modeled in SolidWorks, the upper limb model obtained being very close in form
to the real one. Based on this model, the mass properties were calculated. The
differential equations of motion obtained were solved using the Lagrange
formalism.
Keywords:
Modeling, Human upper limb, Dynamic model, Osteoarticular system
Title of the Paper:
Building Three Dimensional Maps based on Cellular Neural Networks
DOWNLOAD FULL PDF
Authors:
Michal Gnatowski, Barbara Siemiatkowska, Rafal Chojecki
Abstract: One of the main problems in robotics is map building. In this
article, 3D map building based on 3D laser data is presented. The data are
obtained from a SICK laser mounted on a rotating support, which gives a 3D
representation of the scene. The map is divided into cells; each cell
represents a certain area of the scene and keeps a list of objects.
This is a real-time system which consumes little computer memory and works
properly in indoor environments.
Keywords:
Modeling, Human upper limb, Dynamic model, Osteoarticular system
Title of the Paper:
Software Simulation for Femur Fractures in Case of Frontal Car Accidents
DOWNLOAD FULL PDF
Authors:
Cris Precup, Antoanela Naaji, Csongor Toth, Arpad Toth
Abstract: According to a study carried out by one of the authors at the County
Clinical Emergency Hospital in Arad, over 40% of the accidents resulting in
femur fracture take place on roads and highways.
This work briefly presents these results and, on the basis of bibliographical
data regarding the mechanical behavior of bones and the classification of
femur fractures found in the expert literature, describes a simulation program
for these fractures in the case of car accidents.
The aim of the program is to determine the pressure at which the femur is
fractured in different conditions of mass and speed of the vehicle, braking
distance, age of the victim, femur size, etc. The use of this program
represents an exemplification method, as an alternative to direct expertise at
the site.
Keywords:
Simulation, Model, Femur, Biomechanics, Fractures
Title of the Paper:
Robot Mapping and Map Optimization Using Genetic Algorithms and Artificial
Neural Networks
DOWNLOAD FULL PDF
Authors:
Inci Cabar, Sirma Yavuz, Osman Erol
Abstract: This paper is about an autonomous robot map creation and
optimization algorithm. To create the map, calibrated sensor data transferred
to the x-y coordinate system were used. Afterwards, we tried to optimize the
anomalies of the map with different methods, such as artificial neural
networks and genetic algorithms.
Keywords:
Robot Map, Genetic Algorithm, Artificial Neural Networks, Autonomous Robot,
Three-Wheeled Robot, Map Optimization, DBSCAN, Robot Kinematics, Sensor
Calibration, Mapping
Title of the Paper:
Modern Approaches in Detection of Page Separators for Image Clustering
DOWNLOAD FULL PDF
Authors:
Costin-Anton Boiangiu, Dan-Cristian Cananau, Andrei-Cristian Spataru
Abstract: This paper describes a model for detecting all types of separators
on a document page and combining the results towards obtaining elements
clusters on every document image page. The separators are determined using
various methods, for example the Delaunay triangulation. The physical layout
of a document is always hard to extract, but determining the simplest
separators found in most documents is the starting point for correct layout
detection.
Keywords:
automatic content conversion, Delaunay triangulation, separator detection,
line detection, contour points, neighborhood relations, white-space detection,
page clustering
Title of the Paper:
Methods of Bitonal Image Conversion for Modern and Classic Documents
DOWNLOAD FULL PDF
Authors:
Costin-Anton Boiangiu, Andrei-Iulian Dvornic
Abstract: Bitonal conversion is a basic preprocessing step in Automatic
Content Analysis, a very active research area in the past years. The
information retrieval process is performed usually on black and white
documents in order to increase efficiency and use simplified investigation
techniques. This paper presents a number of new modern conversion algorithms
which are aimed at becoming an alternative to current approaches used in the
industry. The proposed methods are suitable for both scanned images and
documents in electronic format. Firstly, an algorithm consisting of a contrast
enhancement step, followed by a conversion based on adaptive levelling of the
document is presented. Then a new multi-threshold technique is suggested as a
solution for noise interferences, a common feature of scanned books and
newspapers. Finally, three more approaches adapted to the particular
properties of electronic documents are introduced. Experimental results are
given in order to verify the effectiveness of the proposed algorithms.
Keywords:
automatic content analysis, electronic documents, bitonal conversion,
information retrieval, noise, scanned images, contrast enhancement
Title of the Paper:
Normalized Text Font Resemblance Method Aimed at Document Image Page
Clustering
DOWNLOAD FULL PDF
Authors:
Costin-Anton Boiangiu, Andrei-Cristian Spataru, Andrei-Iulian Dvornic,
Dan-Cristian Cananau
Abstract: This paper describes an approach towards obtaining the normalized
measure of text resemblance in scanned images. The technique, aimed at
automatic content conversion, is relying on the detection of standard
character features and uses a sequence of procedures and algorithms applied
sequentially on the input document. The approach makes use solely of the
geometrical characteristics of characters, ignoring information regarding
context or the character-recognition.
Keywords:
automatic content conversion, text characteristics, font size, boldness,
italic, texture measurements
Issue 8, Volume 7, August 2008
Title of the Paper: A
Synopsis of Sound - Image Transforms based on the Chromaticism of Music
DOWNLOAD FULL PDF
Authors:
Dionysios Politis, Dimitrios Margounakis
Abstract: Numerous algorithms have been proposed for image to sound and sound
to image transforms. Based on heuristic assignments of sound frequencies to
RGB values, they perform effective mappings between musical entities (notes)
and values for optical wavelengths. In this paper, the chromaticism of the
sound domain is taken into account and the coloring of the performed music is
depicted for interrelated sound segments that have acoustic polymorphism and
form entities of pitch organization. This algorithm helps visualize musical
files according to their melodic structure and not only according to their
temporal characteristics (i.e. rhythm).
Keywords:
Sound to Image Transforms, Visualizations, Chromatic Index of Music.
Title of the Paper:
Laboratory Integrated Schema
DOWNLOAD FULL PDF
Authors:
Panagiotis Kalagiakos
Abstract: The IHE initiative for clinical laboratories defines an architecture
that promotes understanding, usability and reusability among the laboratories.
The primary goal is to ensure that medical information reaches healthcare
professionals correctly and on time. The IHE attempt does not define new standards
but uses well established existing standards like HL7, ASTM, DICOM, ISO, IETF,
OASIS, CLSI in a strict framework and defines specific implementations of the
standards to achieve integration goals of clinical laboratories. The
functional components of a healthcare system are called Actors. Actors
interact through transactions. The schema produced by IHE is based on the
notion of Integration Profile and is comprised of Actors. The goal of this
paper is to present the clinical laboratories integration profiles as a
sequence of transactions between actors. The set of Workflow Integration
Profiles involving clinical laboratories, clinical wards and other teams
within healthcare institutions to fully integrate diagnostic testing on in
vitro specimens are presented together with a Content Integration Profile; the
Content Integration Profile enables clinical laboratories (local, regional or
national) as well as standalone laboratories to share their result reports.
Keywords:
LTW, LDA, LPOCT, LCSD
Title of the Paper:
Characterization of Ag-PEO10LiCF3SO3-PolyPyrrole-Au
Neural Switch
DOWNLOAD FULL PDF
Authors:
Mahmoud Z. Iskandarani
Abstract: The design, construction and characterization of a semiconducting,
organic neural switch are carried out.
Voltage-current characteristics for an ionic-electronic interacting neural
switch are presented. The neural switch characteristics are analyzed using an
electronic equivalent circuit model. The model is shown to describe, with a
high degree of soundness, the interaction mechanism between the materials
used. The programmability of the switch is shown to be bi-directional and
reversible, with a hysteresis effect due to excess charge storage and a
behavior similar to a biological synapse.
Keywords:
Neural, modeling, Synapse, Memory, Information Processing, Polymers,
Computing, Intelligence
Title of the Paper:
Anti-Counterfeit Ownership Transfer Protocol for Low Cost RFID System
DOWNLOAD FULL PDF
Authors:
Chin-Ling Chen, Yu-Yi Chen, Yu-Cheng Huang, Chen-Shen Liu, Chia-I Lin,
Tzay-Farn Shih
Abstract: Radio Frequency Identification (RFID) is a new technology that has,
in recent years, become convenient and feasible in many applications. However,
it also raises many security issues worth discussing. Counterfeiting poses a
menace to industry worldwide, and the problem is not specific to certain
products or countries. In 2003, Koh et al. described an RFID system based on a
"track and trace" solution applied to pharmaceutical supply chain management
to fight counterfeiting. Further applications addressing malicious behavior
have since been presented, but disputes remain and some do not conform to the
Class 1 Generation 2 (C1G2) standard. Unfortunately, attack tricks keep
changing, and Koh et al.'s scheme remains rather primitive. In order to tackle
this problem, we propose an anti-counterfeit ownership transfer protocol for
low-cost RFID systems that uses only a tag as the storage medium. The proposed
scheme can ensure a secure transaction.
Keywords:
security; digital signature; authentication; Anti- counterfeit; ownership
transfer; RFID; EPC
Title of the Paper: DDoS
Attacks Detection Model and its Application
DOWNLOAD FULL PDF
Authors:
Muhai Li, Ming Li, Xiuying Jiang
Abstract: With the proliferation of Internet applications and network-centric
services, network and system security issues are more important than before.
In the past few years, cyber attacks, including distributed denial-of-service
(DDoS) attacks, have increased significantly on the Internet, degrading
confidence and trust in its use. However, present DDoS attack detection
techniques face the problem that they cannot distinguish flooding attacks from
abrupt changes in legitimate activity. In this paper, we give a model for
detecting DDoS attacks based on network traffic features to solve this
problem. In order to apply the model conveniently, we design its
implementation algorithm. Evaluating the algorithm on actual data shows that
it can identify DDoS attacks.
Keywords:
Algorithm, Attack, Application, DDoS, Detection, Model
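As a toy stand-in for traffic-feature-based detection, the sketch below flags intervals whose packet rate deviates from a moving average by more than a few standard deviations; the feature, window and threshold are illustrative assumptions and not the model proposed in the paper.

from collections import deque
import statistics

# Toy stand-in for traffic-feature-based DDoS detection: flag an interval
# whose packet rate exceeds the recent moving average by more than k standard
# deviations. The feature and threshold are illustrative only.
def detect(rates, window=30, k=4.0):
    history, alarms = deque(maxlen=window), []
    for t, r in enumerate(rates):
        if len(history) == window:
            mean = statistics.fmean(history)
            std = statistics.pstdev(history) or 1.0
            if r > mean + k * std:
                alarms.append(t)
        history.append(r)
    return alarms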
Title of the Paper:
Topics Related with the Wind Turbine
DOWNLOAD FULL PDF
Authors:
Jose De Jesus Rubio-Avila,
Andres Ferreyra-Ramirez, Fernando Baruch, Santillanes-Posada, Martin
Salazar-Pereyra, Genaro Deloera-Flores
Abstract: In this paper we present the modeling of a wind turbine, using the
Euler-Lagrange method and circuit theory. We derive the mathematical equations
(the model) that describe the wind turbine and simulate them using MATLAB. We
offer this result as a contribution and an open line of research in the field
of renewable energies in our country.
Keywords:
Wind turbine, Modeling, Simulation.
Title of the Paper: The
Clustering Algorithm for Nonlinear System Identification
DOWNLOAD FULL PDF
Authors:
Jose De Jesus Rubio Avila,
Andres Ferreyra Ramirez, Carlos Aviles-Cruz, Ivan Vazquez-Alvarez
Abstract: A new on-line clustering fuzzy neural network is proposed. In the
algorithm, structure and parameter learning are updated at the same time;
there is no separation between structure learning and parameter learning. The
algorithm generates groups with a given radius, and each center is updated so
that it stays close to the incoming data at each iteration. In this way, the
algorithm does not need to generate a new rule at each iteration, i.e., it
does not generate many rules and does not need to prune rules.
Keywords:
Clustering algorithm, Fuzzy systems, Modeling, Identification.
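A stripped-down version of the on-line clustering idea, ignoring the fuzzy-neural machinery, can be written in a few lines of Python: the nearest center is pulled toward each incoming sample and a new group is created only when no center lies within the given radius. The class and parameter names are our own.

import numpy as np

# Minimal on-line clustering with a fixed radius: a simplification of the
# fuzzy-neural variant described in the abstract, not the authors' algorithm.
class OnlineClusterer:
    def __init__(self, radius, lr=0.1):
        self.radius, self.lr = radius, lr
        self.centers, self.counts = [], []

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if self.centers:
            d = [np.linalg.norm(x - c) for c in self.centers]
            i = int(np.argmin(d))
            if d[i] <= self.radius:
                self.centers[i] += self.lr * (x - self.centers[i])  # pull center
                self.counts[i] += 1
                return i
        self.centers.append(x.copy())        # no nearby center: start a new group
        self.counts.append(1)
        return len(self.centers) - 1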
Title of the Paper:
Using Genetic Algorithms to Find New Color Transformations for JPEG-2000
Image Compression
DOWNLOAD FULL PDF
Authors:
Mohammed S. Al-Rawi, Abdel-Latif Abu-Dalhoum, Yousef Salah, Wesam
Al-Mobaideen, Ansar Khoury
Abstract: Image compression techniques play an important role in multimedia
applications. The JPEG-2000 which is based on the wavelet transform is a
promising image compression technique expected to replace the current discrete
cosine transform based compression known as JPEG. In this paper, genetic
algorithms are used to optimize the coefficients of the RGB to YCbCr color
transform used in JPEG-2000 and to find alternate color transforms. Color
transformation in JPEG-2000 is an early phase of the compression process
intended to improve the compression performance. The matrix elements are
optimized using fitness functions based on the root mean square error between
the original and the reconstructed image. The resultant color transformations
revealed an enhancement of the JPEG-2000 codec.
Keywords:
Genetic Algorithms, Irreversible Color Transform, Reversible Color Transform,
JPEG2000, Image Compression, Wavelet Transform, ITU-R, jasper.
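To give a flavour of the optimization loop, here is a toy genetic search over 3x3 colour-transform matrices scored by round-trip RMSE on a random image; a faithful evaluation would run the full JPEG-2000 codec on the transformed planes, so this fitness function is only a placeholder.

import numpy as np

# Toy genetic search over 3x3 color-transform matrices, scored by round-trip
# RMSE (forward transform, rounding, inverse) on a random RGB image. A real
# evaluation would compress/decompress with a JPEG-2000 codec instead.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64, 3)).astype(float)

def fitness(M):
    try:
        inv = np.linalg.inv(M)
    except np.linalg.LinAlgError:
        return 1e9                             # penalize singular transforms
    coded = np.rint(img @ M.T)                 # forward transform + rounding
    recon = coded @ inv.T                      # inverse transform
    return float(np.sqrt(np.mean((img - recon) ** 2)))

pop = [np.eye(3) + 0.1 * rng.normal(size=(3, 3)) for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness)                      # elitist selection
    parents = pop[:10]
    children = [p + 0.05 * rng.normal(size=(3, 3)) for p in parents for _ in range(3)]
    pop = parents + children                   # mutation-only reproduction
print("best RMSE:", fitness(pop[0]))
print("matrix:\n", pop[0])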
Title of the Paper: Grid
Workflows Specification and Verification
DOWNLOAD FULL PDF
Authors:
P. Kurdel, J. Sebestyenova
Abstract: Grids are being adopted and developed in several scientific
disciplines that deal with large-scale collaboration, massive distributed
data, and distributed computing problems. The service orchestration is a
problem of making multiple services coordinate themselves and communicate in
an orderly fashion so as to accomplish a task more complex than the single
tasks provided by the individual composing services. Composition is devoted to
the aim of connecting services in a collaborative fashion. A Grid workflow
system is a type of application-level Grid middleware that is supposed to
support modelling, redesign and execution of large-scale processes. A grid
workflow can be represented by a grid workflow graph, where nodes correspond
to activities and edges correspond to dependencies between activities, called
flows. Verification is usually based on an extension of some formal method.
Grid workflow verification and validation must be conducted so that we can
identify any violations of correctness in the workflow specification and
consequently remove them in time.
Keywords:
Distributed computing, Grid infrastructure, Service orchestration, Workflow
management system, Verification, Web portal
Title of the Paper:
Better Learning of Supervised Neural Networks Based on Functional Graph – An
Experimental Approach
DOWNLOAD FULL PDF
Authors:
Joseph Raj V.
Abstract: Multilayered feed forward neural networks possess a number of
properties which make them particularly suited to complex problems. Neural
networks have been in use in numerous meteorological applications including
weather forecasting. As Neural Networks are being more and more widely used in
recent years, the need for their more formal definition becomes increasingly
apparent. This paper presents a novel architecture of neural network models
using the functional graph. The neural network creates a graph representation
by dynamically allocating nodes to code local form attributes and establishing
arcs to link them. The application of functional graph in the architecture of
Electronic neural network and Opto-electronic neural network is detailed with
experimental results. Learning is defined in terms of functional graph. The
proposed architectures are applied to weather forecasting and the XOR problem.
The weather forecasting has been carried out based on various factors
consolidated from meteorological experts and documents. The inputs are
temperature, air pressure, humidity, cloudiness, precipitation, wind
direction, wind speed, etc., and outputs are heavy rain, moderate rain and no
rain. The percentage of correctness of the weather forecasting of the
conventional neural network models, functional graph based neural network
models and the meteorological experts are compared.
Keywords:
Back propagation, Convergence, Functional Graph, Neural network, Optical
neuron, Weather forecasting
Title of the Paper: GSM
Mobile SMS/MMS using Public Key
Infrastructure: m-PKI
DOWNLOAD FULL PDF
Authors:
Nor Badrul Anuar, Lai Ngan Kuen, Omar Zakaria, Abdullah Gani, Ainuddin Wahid
Abdul Wahab
Abstract: Presently, mobile handheld devices have successfully replaced the
traditional telephone to become the most popular wireless communication tools.
Mobile Short Message Service (SMS) and Multimedia Message Service (MMS)
fulfill almost all user requirements as an effective communication and
information delivery service. Since SMS/MMS have become so popular in daily
communication, there is a demand to communicate or exchange confidential
information in a secure environment. Public Key Infrastructure (PKI), which
uses key pairs for encryption, is a proven solution for secure communication.
In this paper, m-PKI is introduced to provide PKI encryption for mobile SMS
and MMS. This new approach allows the end user to send private and classified
messages via SMS. The key pair generation and distribution are performed by a
Certificate Authority (CA). The size of the key pair is studied and decided by
the trade-off between performance and security.
Keywords:
cryptography, message classification, PKI, SMS/MMS, RSA
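A minimal round trip showing the key-pair mechanics behind such a scheme, using RSA-OAEP from the Python cryptography package; key generation and distribution by a CA, and the key-size versus performance trade-off studied in the paper, are outside this sketch.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Minimal RSA-OAEP round trip for an SMS-sized message using the Python
# "cryptography" package; CA-based key distribution is not modeled here.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

sms = b"Meet at 18:00, platform 4."
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = public_key.encrypt(sms, oaep)          # sender side
plaintext = private_key.decrypt(ciphertext, oaep)   # receiver side
assert plaintext == sms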
Title of the Paper:
Software Project Management: New Analysis of Specific Engineering and
Management Processes in Modern Graphical Printing Industry Production
DOWNLOAD FULL PDF
Authors:
Vladimir Simovic
Abstract: Proper and effective Software Project Management (SPM) is usually
the most important factor in the outcome of a project for many companies in
“modern graphical printing industry production” (MGPIP) and their project
engineers and managers [7]. This article opens new analytical directions for
proper management of software projects in MGPIP, which can be one of the
important reasons for success. By using effective project management
techniques, a project manager can improve the chances of success in graphical
printing production. The problem is how to combine these techniques
analytically into a practical and effective workable process. An effective
solution requires a balanced process that covers the management and production
of the entire project from inception to completion. This work proposes that a
few simulation models (and stochastic simulations) can, on a scientific
research basis, analytically solve specific management and engineering
problems of organisation, controlling and monitoring of SPM in MGPIP. The
mentioned simulation models are the basis for simulating components of the
whole graphical production process, from the arrival of digital records to the
finished printing plate. Why simulate these components? One important answer
is that MGPIP is nowadays undergoing a major change in (especially "mass
printing") production technology, integrating traditional printing with
digital printing and moving into the space of digital printing for internet,
intranet and wide web systems usage [5]. This research solves some practical
problems of MGPIP in accordance with the above, and the paper shows the
results of a scientific comparison of existing systems that function in
different ways in the MGPIP of the famous Croatian printing house "Vjesnik"
Zagreb and the largest Croatian (daily news) publishing house EPH
("Europapress Holding") Zagreb, whose main aim is offering content, knowledge,
information, etc., and reaching as many users as possible by means of this and
similar solutions. This research was part of the main scientific research
project "Analytical Model for Monitoring of New Education Technologies for
Long Life Learning" conducted by the Ministry of Science, Education and Sports
of the Republic of Croatia (Registered Number 227-2271694-1699).
Keywords:
Software Project Management, Stochastic Simulation, Modern Graphical Printing
Industry Production
Title of the Paper:
Developing Multi-User Online Games with Agents
DOWNLOAD FULL PDF
Authors:
Agostino Poggi
Abstract: This paper presents a software framework able to extend the coverage
of virtual communities across the boundaries of the everyday Internet by
providing ubiquitous and integrated access to applications regardless of
whether users connect through conventional fixed-network computers, mobile
wireless-network devices or even interactive digital television set-top boxes.
This software framework is an extension of the JADE multiagent development
environment and has been experimented for the realization of a special kind of
virtual community applications, i.e., multi-user online games.
Keywords:
Multi-agent systems, Mobile devices, DTV, Multi-games, Java.
Title of the Paper: A
Fast and Robust Method for the Identification of Face Landmarks in Profile
Images
DOWNLOAD FULL PDF
Authors:
Andrea Bottino, Sandro Cumani
Abstract: Several applications in Computer Vision, like recognition,
identification, automatic 3D modeling and animation and non conventional human
computer interaction require the precise identification of landmark points in
facial images. Here we present a fast and robust algorithm capable of
identifying a specific set of landmarks on face profile images. First, the
face is automatically segmented from the image. Then, the face landmarks are
extracted. The algorithm is based on the local curvature of the profile
contour and on the local analysis of the face features. The robustness of the
presented approach is demonstrated by a set of experiments where ground truth
data are compared with the results of our algorithm. A percentage of 92%
correct identification and a mean error of 3.5 pixels demonstrate the
robustness of the approach, which is of paramount importance for several
applications.
Keywords:
silhouettes, profile images, face landmark, robust identification
Title of the Paper:
Hierarchical Localization Strategy for Wireless Sensor Networks
DOWNLOAD FULL PDF
Authors:
Tzay-Farn Shih, Wei-Teng Chang
Abstract: Wireless Sensor Network (WSN) is an emerging network technology.
Among the varied research focused upon WSN, location awareness is a topic
worthy of study. We put forward a hierarchical localization strategy. By using
wireless localization and a few GPS-equipped nodes, sensor networks can be
more economical and localization can be more accurate. Our strategy includes
two aspects: first, obtaining a relatively good measurement-error value for
the surveyed distance between two nodes by way of simple statistics and then
correcting the surveyed distance to get a more accurate node location; second,
using relaying nodes to fulfill the hierarchical positioning strategy. The
advantage of these methods is that they can reduce the cost of the
considerable extra hardware common in sensor networks. Simulation experiments
confirm that the proposed methods can effectively improve localization
accuracy and enhance the localization rate of estimated nodes.
Keywords:
Wireless Sensor Network; GPS; Wireless Ad Hoc network; AOA; TOA; TDOA
Title of the Paper:
Location-Based Multicast Routing Protocol for Mobile
Ad Hoc Networks
DOWNLOAD FULL PDF
Authors:
Tzay-Farn Shih, Chao-Cheng Shih, Chin-Ling Chen
Abstract: Wireless networks offer the freedom to move around within the
effective transmission area, together with flexibility and ease of use for
Internet applications. Many computer network applications involve multiple
users and rely on the ability of the network to provide multicast services.
Thus, multicasting is considered an essential part of Ad Hoc networks. Some
of the proposed routing algorithms require maintaining a global network state
at each node, the imprecision of global state and the large amount of storage
and communication overhead induce poor scalability. In this paper, we propose
a distributed cluster-based QoS multicast routing algorithm which only
requires maintaining a local state at each node. The location information
provided by positioning device is aided in route discovery and route
maintenance procedure. Our protocol partitions the network into square
clusters. In each cluster, a cluster head and gateways are selected by a
cluster head selection algorithm and a gateway selection algorithm
respectively. After the construction of cluster heads and gateway nodes, a
distributed computation collectively utilizes the local state information to
construct the multicast tree on a hop-by-hop basis. Simulations are conducted to
evaluate the performance of our algorithm. As it turns out, our protocol has
better performance and lower routing overhead than the non-cluster based
algorithm.
Keywords: Mobile ad hoc networks, Multicasting, QoS, GPS,
Proactive routing, Reactive routing
Title of the Paper:
Performance and Reliability Analysis of New Fault-Tolerant Advance Omega
Network
DOWNLOAD FULL PDF
Authors:
Rita Mahajan, Renu Vig
Abstract: The performance and fault tolerance are two very crucial factors in
designing interconnection networks for a multiprocessor system. A new type of
MIN, the Fault-Tolerant Advanced Omega Network, which uses 4×4 switches, is
proposed on the basis of the ideas of the Delta network and the Omega network.
In this paper, it is proved that 4×4 switches have better performance/cost
ratios than 2×2 switches at the current level of VLSI technology. The paper
expounds the network's topological properties and routing algorithm and makes
performance/cost ratio comparisons. A mathematical analysis approach is used
to find the probability of acceptance and the bandwidth as traffic changes.
Furthermore, a reliability analysis of the Fault-Tolerant Advanced Omega
Network (FTAON) is discussed in detail. It is seen that FTAON is more reliable
and cost effective than other previously proposed MINs of a similar class. It
has also been observed to have fault-tolerant and nonblocking capability
in complex parallel systems.
Keywords:
Interconnection Network, Advanced Omega Network, Fault-Tolerance, Reliability.
Title of the Paper:
Performance Review of Taiwanese IC Design Industry: DEA-based Malmquist
Productivity Measure
DOWNLOAD FULL PDF
Authors:
Wei-Hwa Pan, Yuan-Yao Feng, Yueh-Chuen Huang, Yan-Kwang Chen
Abstract: The total revenue of Taiwan's
IC design industry is now the second in the world, only behind the United States.
To keep pace with foreign leaders, continual innovation by IC design companies
to maintain and enhance their performance is most important for obtaining a
sustainable competitive advantage. This paper is concerned with a study
exploring the performance of Taiwan's IC design industry, including the
managerial and productive technical efficiencies and their change over time. A
data envelopment analysis (DEA)-based Malmquist method was employed to analyze
the financial and non-financial data of 72 companies from the financial panel
listed on the Taiwan Stock Exchange and to examine the performance of these
companies over the period from 2003 to 2005.
Accordingly, IC design companies can recognize which function is important to
their performance and which function can be further improved to achieve
competitive advantage in the industry.
Keywords:
Data envelopment analysis, Malmquist productivity, Efficiency, IC design
industry
Title of the Paper:
T-Detector Maturation Algorithm with Overlap Rate
DOWNLOAD FULL PDF
Authors:
Jungan Chen, Wenxin Chen, Feng Liang
Abstract: A parameter called overlap rate is proposed to control the number of
valid detectors generated for a T-detector Maturation Algorithm. The achieved
algorithm TMA-OR can reduce the number of detectors for abnormal detection.
Experimental results show that TMA-OR is more effective than V-detector
algorithms such as the naive estimate and hypothesis testing methods, and can
be applied to different data sets.
Keywords:
Artificial immune system, overlap rate, match range
Title of the Paper:
StegCure: A Comprehensive Steganographic Tool using Enhanced LSB Scheme
DOWNLOAD FULL PDF
Authors:
L. Y. Por, W. K. Lai, Z. Alireza, T. F. Ang, M.T. Su, B. Delina
Abstract: Protected and encrypted data sent electronically is vulnerable to
various attacks such as spyware and attempts in breaking and revealing the
data. Thus, steganography was introduced to conceal a secret message into an
unsuspicious cover medium so that it can be sent safely through a public
communication channel. Suspicion is the key determinant in the field of
steganography; in other words, an efficient steganographic algorithm will not
arouse any suspicion after the hidden data are embedded. This
paper presents an overview of steganography on GIF image format in order to
explore the potential of GIF in information hiding research. A platform named
StegCure is proposed that amalgamates three different Least Significant Bit (LSB)
insertion algorithms to perform steganography. The paper explains the enhancement
of LSB insertion techniques from the most basic, conventional 1-bit scheme to the
LSB colour cycle method. Various kinds of existing
steganographic methods are discussed and some inherent problems are
highlighted along with some issues on existing solutions. In comparison with
the other data hiding applications, StegCure is a more comprehensive security
utility where it offers user-friendly functionality with interactive graphic
user interface and integrated navigation capabilities. Furthermore, in order
to sustain a higher level of security, StegCure has implemented a Public Key
Infrastructure (PKI) mechanism at both sender and receiver sites. With this
feature, StegCure manages to restrict any unauthorized user from retrieving
the secret message through trial and error. We also highlight a few aspects of
LSB methods in image steganography. At the end of the paper, the evaluation
results of the hybrid method in StegCure are presented. Future work will focus on
assimilating more diversified methods into the full gamut of steganography
systems and on improving robustness against steganalysis.
Keywords:
steganography, GIF, security, information hiding, least significant bit, LSB.
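To illustrate the basic 1-bit LSB insertion mentioned in the abstract above, here is a minimal Python sketch that embeds and recovers message bits in a flat list of 8-bit pixel values; StegCure's GIF palette handling, colour-cycle variant and PKI layer are not reproduced, and the cover values are invented.

def embed_lsb(pixels, message_bits):
    """Overwrite the least significant bit of each pixel with one message bit."""
    if len(message_bits) > len(pixels):
        raise ValueError("cover too small for message")
    stego = list(pixels)
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & ~1) | bit   # clear the LSB, then set it to the message bit
    return stego

def extract_lsb(pixels, n_bits):
    """Read back the first n_bits least significant bits."""
    return [p & 1 for p in pixels[:n_bits]]

if __name__ == "__main__":
    cover = [123, 200, 57, 89, 14, 240, 33, 91]
    bits = [1, 0, 1, 1, 0, 0, 1, 0]
    stego = embed_lsb(cover, bits)
    assert extract_lsb(stego, len(bits)) == bits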
Title of the Paper:
Orthogonal Array
Application for Optimal Combination of Software Defect Detection Techniques
Choices
DOWNLOAD FULL PDF
Authors:
Ljubomir Lazic, Nikos Mastorakis
Abstract: In this paper, we consider a problem that arises in black box
testing: generating small test suites (i.e., sets of test cases) where the
combinations that have to be covered are specified by input-output parameter
relationships of a software system. That is, we only consider combinations of
input parameters that affect an output parameter, and we do not assume that
the input parameters have the same number of values. To solve this problem, we
propose interaction testing, particularly an Orthogonal Array Testing Strategy
(OATS) as a systematic, statistical way of testing pairwise interactions. In the
software testing process (STP), it provides a natural mechanism for testing
systems to be deployed on a variety of hardware and software configurations.
The combinatorial approach to software testing uses models to generate a
minimal number of test inputs so that selected combinations of input values
are covered. The most common coverage criteria are two-way or pairwise
coverage of value combinations, though for higher confidence three-way or
higher coverage may be required. This paper presents some examples of
software-system test requirements and corresponding models for applying the
combinatorial approach to those test requirements. The method bridges
contributions from mathematics, design of experiments, software test, and
algorithms for application to usability testing. Also, this study presents a
brief overview of the response surface methods (RSM) for computer experiments
available in the literature. The Bayesian approach and orthogonal arrays
constructed for computer experiments (OACE) were briefly discussed. An example of
a novel OACE application to an STP optimization study was also given. In this
case study, an orthogonal array for computer experiments was
utilized to build a second order response surface model. Gradient-based
optimization algorithms could not be utilized in this case study since the
design variables were discrete valued. Using the novel OACE approach, the optimum
combination of software defect detection technique choices for every software
development phase, maximizing the overall Defect Detection Effectiveness of the
STP, was determined.
Keywords:
Software testing, Optimization, Design of Experiments, Orthogonal array
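As a small companion to the pairwise coverage criterion discussed in the abstract above, the following Python sketch measures 2-way coverage of a candidate test suite; the parameter names and the orthogonal-array-like suite are invented for illustration, and the construction of orthogonal arrays themselves is not shown.

from itertools import combinations, product

def pairwise_coverage(parameters, suite):
    """Return the covered and the required sets of (parameter, value) pair combinations."""
    names = list(parameters)
    required = set()
    for p, q in combinations(names, 2):
        for vp, vq in product(parameters[p], parameters[q]):
            required.add(((p, vp), (q, vq)))
    covered = set()
    for test in suite:
        for p, q in combinations(names, 2):
            covered.add(((p, test[p]), (q, test[q])))
    return covered & required, required

if __name__ == "__main__":
    params = {"os": ["linux", "win"], "db": ["mysql", "pg"], "browser": ["ff", "chrome"]}
    suite = [
        {"os": "linux", "db": "mysql", "browser": "ff"},
        {"os": "linux", "db": "pg", "browser": "chrome"},
        {"os": "win", "db": "mysql", "browser": "chrome"},
        {"os": "win", "db": "pg", "browser": "ff"},
    ]
    covered, required = pairwise_coverage(params, suite)
    print(f"{len(covered)}/{len(required)} pairs covered")  # 12/12 for this 4-test suite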
Title of the Paper:
Datapath Error Detection with No Detection Latency for High-Performance
Microprocessors
DOWNLOAD FULL PDF
Authors:
Yung-Yuan Chen, Kuen-Long Leu, Kun-Chun Chang
Abstract: Error detection plays an important role in fault-tolerant computer
systems. Two primary parameters of concern in error detection are coverage and
latency. In this paper, a new hybrid error-detection approach offering very high
coverage with zero detection latency is proposed to protect the data paths of
high-performance microprocessors. Zero detection latency is essential for
real-time error recovery. The hybrid error-detection approach combines
duplication with comparison, triple modular redundancy (TMR) and self-checking
mechanisms to construct a formal framework,
which allows the error-detection schemes of varying hardware complexity,
performance and error-detection coverage to be incorporated. An experimental
32-bit VLIW core was employed to demonstrate the concept of the hybrid detection
approach. Hardware implementations in VHDL and simulated fault-injection
experiments were conducted to measure key design metrics, such as hardware
overhead, performance degradation and error-detection coverage.
Keywords:
Concurrent error detection, error-detection coverage, error-detection latency,
fault injection, hybrid detection approach.
Title of the Paper: A
Novel Boolean Algebraic Framework for Association and Pattern Mining
DOWNLOAD FULL PDF
Authors:
Hatim A. Aboalsamh
Abstract: Data mining has been defined as the non-trivial extraction of
implicit, previously unknown and potentially useful information from data.
Association mining and sequential mining analysis are considered as crucial
components of strategic control over a broad variety of disciplines in
business, science and engineering. Association mining is one of the important
sub-fields in data mining, where rules that imply certain association
relationships among a set of items in a transaction database are discovered.
In sequence mining, data are represented as sequences of events, where the order
of those events is important. Finding patterns in sequences is valuable for
predicting future events. In many applications such as the WEB applications,
stock market, and genetic analysis, finding patterns in a sequence of elements
or events, helps in predicting what could be the next event or element. At the
conceptual level, association mining and sequence mining are two similar
processes but using different representations of data. In association mining,
items are distinct and the order of items in a transaction is not important.
While in sequential pattern mining, the order of elements (events) in
transactions (sequences) is important, and the same event may occur more than
once. In this paper, we propose a new mapping function that maps event sequences
into itemsets. Based on this unified representation of association mining and
sequential patterns, a new approach is proposed that uses the Boolean
representation of the input database D to build a Boolean matrix M. Boolean
algebra operations are applied to M to generate all frequent itemsets.
Finally, frequent items or frequent sequential patterns are represented by
logical expressions that could be minimized by using a suitable logical
function minimization technique.
Keywords:
Sequence mining, data mining, association mining, Boolean association
expressions, Boolean matrix, Association matrix.
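To make the Boolean-matrix view described in the abstract above concrete, the following Python sketch turns transactions into rows of a 0/1 matrix M and computes the support of an itemset as the number of rows whose AND over the itemset's columns is 1; the frequent-itemset enumeration strategy and the logic-minimization step of the paper are not shown, and the tiny transaction set is invented.

from itertools import combinations

def boolean_matrix(transactions, items):
    return [[1 if item in t else 0 for item in items] for t in transactions]

def support(M, items, itemset):
    idx = {it: i for i, it in enumerate(items)}
    cols = [idx[it] for it in itemset]
    # Row contributes to support only if every column of the itemset is 1 (bitwise AND).
    return sum(1 for row in M if all(row[c] for c in cols))

def frequent_pairs(transactions, items, min_support):
    M = boolean_matrix(transactions, items)
    return [pair for pair in combinations(items, 2)
            if support(M, items, pair) >= min_support]

if __name__ == "__main__":
    items = ["bread", "milk", "beer"]
    T = [{"bread", "milk"}, {"bread", "beer"}, {"bread", "milk", "beer"}, {"milk"}]
    print(frequent_pairs(T, items, min_support=2))  # [('bread', 'milk'), ('bread', 'beer')]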
Title of the Paper:
Lexicalized and Statistical Parsing of Natural Language Text in Tamil using
Hybrid Language Models
DOWNLOAD FULL PDF
Authors:
M. Selvam, A. M. Natarajan, R. Thangarajan
Abstract: Parsing is an important process of Natural Language Processing (NLP)
and Computational Linguistics which is used to understand the syntax and
semantics of natural language (NL) sentences with respect to a grammar. A parser
is a computational system which processes an input sentence according to the
productions of the grammar and builds one or more constituent structures
which conform to the grammar. The interpretation of natural language text
depends on the context also. Language models need syntax and semantic coverage
for the better interpretation of natural language sentences in small and large
vocabulary tasks. Though statistical parsing with trigram language models gives
better performance through trigram probabilities and large vocabulary size, it
has disadvantages such as a lack of syntactic support and difficulties with free
word order and long-distance relationships. Grammar-based structural parsing
provides solutions to some extent, but it is very tedious for larger-vocabulary
corpora.
To overcome these disadvantages, structural component is to be involved in
statistical approach which results in hybrid language models like phrase and
dependency structure language models. To add the structural component, balance
the vocabulary size and meet the challenging features of Tamil language,
Lexicalized and Statistical Parsing (LSP) is to be employed with the
assistance of hybrid language models. This paper focuses on lexicalized and
statistical parsing of natural language text in Tamil language with
comparative analysis of phrase and dependency language models. For the
development of hybrid language models, a new part-of-speech (POS) tag set with
more than 500 tags and a dependency tag set with 31 tags for the Tamil language
have been developed, providing wider coverage. Phrase and dependency structure
treebanks have been developed with 3261 Tamil sentences which cover 51026
words. Hybrid language models were developed using these treebanks, employed
in LSP and evaluated against gold standards. This LSP with hybrid language
models provides better results and covers all the challenging features of
Tamil language.
Keywords:
Dependency Structure, Hybrid Language Model, Lexicalized and Statistical
Parsing, Natural Language Processing, Part of Speech, Treebank, Phrase
Structure, Trigram Language Model, Tamil Language.
Issue 9, Volume 7, September 2008
Title of the Paper: WordNet-based Summarization of Unstructured Document
DOWNLOAD FULL PDF
Authors: Chenghua Dang,
Xinjun Luo, Haibin Zhang
Abstract: This paper presents an improved and practical approach to
automatically summarizing unstructured document by extracting the most
relevant sentences from the plain text or HTML version of the original document.
The proposed technique is based upon key sentences identified using statistical
methods and WordNet. Experimental results show that our approach compares
favourably to a commercial text summarizer, and that some refinement techniques
improve the summarization quality significantly.
Key-words:
Document Summarization, Key Sentence, WordNet, POS, Semantic Similarity
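The sketch below illustrates only the generic extractive key-sentence step of such a pipeline: sentences are scored by the frequency of their non-stopword terms and the top ones are kept. It is a bare-bones stand-in; the paper's WordNet-based semantic similarity and refinement techniques are not reproduced, and the stopword list and sample text are invented.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "that", "this", "it"}

def summarize(text, n_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    freq = Counter(words)
    def score(sentence):
        tokens = [w for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS]
        return sum(freq[w] for w in tokens) / (len(tokens) or 1)
    ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # Keep the chosen sentences in their original order for readability.
    return " ".join(s for s in sentences if s in ranked)

if __name__ == "__main__":
    doc = ("Solar power is growing quickly. Solar panels convert sunlight into power. "
           "Many people enjoy gardening. Cheap solar power changes energy markets.")
    print(summarize(doc, 2))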
Title of the Paper: Data
Models for Retrieving Task-Specific and Technicians-Adaptive Hypermedia
DOWNLOAD FULL PDF
Authors: Ammar
M. Huneiti
Abstract: - This paper introduces a set of data models for facilitating the
retrieval of task-specific and user-adaptive hypermedia documents concerning
product fault diagnosis. These models include an integrated fault data model,
a stereotype user model, and a semantic product data model. Moreover, the
paper outlines the benefits of employing adaptive hypermedia to support the
performance of technicians specifically for product fault diagnosis. The
suggested stereotype user model represents the knowledge of the technician
regarding the performed task. This user model is then used for the adaptive
retrieval of finely separated and semantically classified product information
elements. A detailed example of how task-specific and user-centred hypermedia
can assist in synchronizing the output of a product diagnostic expert system
with the product technical documentation is introduced. A general architecture
for the suggested adaptive hypermedia system is outlined. The data models
proposed in this paper are demonstrated through a prototype adaptive expert
system for locating and correcting braking system faults in a forklift truck.
Key-Words:
- Adaptive hypermedia, Semantic data modeling, Performance support systems,
User modeling, Diagnostic expert systems.
Title of the Paper: A
Generalized Software Fault Classification Model
DOWNLOAD FULL PDF
Authors:
Omar Shatnawi, P. K. Kapur
Abstract: - Most non-homogenous Poisson process (NHPP) based software
reliability growth models (SRGMs) presented in the literature assume that the
faults in the software are of the same type. However, this assumption is not
truly representative of reality. It has been observed that the software
contains different types of faults and each fault requires different
strategies and different amount of testing-effort to remove it. If this
assumption is not taken into account, the SRGM may give misleading results.
This paper proposes a generalized model based on classifying the faults in
the software system according to their removal complexity. The removal
complexity is proportional to the amount of testing-effort required to remove
the fault. The testing-effort expenditures are represented by the number of
stages required to remove the fault after the failure observation / fault
isolation (with time delays between the stages). Therefore, it explicitly takes
into account faults of different severity and can capture variability in the
growth curves depending on the environment in which the model is used, while at
the same time retaining the capability to reduce to either exponential or
S-shaped growth curves. Such a modelling approach is well suited to object-oriented
programming and distributed development environments. Actual software
reliability data have been used to demonstrate the proposed generalized model.
Key-Words: - Software engineering, Software
testing, Software reliability, NHPP, SRGM, Fault severity.
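As background to the growth curves mentioned in the abstract above, two standard NHPP mean value functions, the exponential (Goel-Okumoto) and the delayed S-shaped curve, are shown below. These are the well-known special cases to which the abstract says the generalized model can reduce, not the authors' generalized formulation itself:

\[
  m_{\mathrm{exp}}(t) = a\left(1 - e^{-bt}\right), \qquad
  m_{\mathrm{S}}(t) = a\left(1 - (1 + bt)\,e^{-bt}\right),
\]

where a is the expected total number of faults in the software and b is the fault detection/removal rate per fault.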
Title of the Paper: Implementation of Real-Time Video Conference System
Using high speed Multimedia Communication
DOWNLOAD FULL PDF
Authors: Hyen Ki Kim
Abstract: - Recently, Peer-to-Peer (P2P) networks have become more and more
popular. This paper describes an implementation of a real-time video conference
system using high-speed multimedia communication. The proposed system has a
hybrid architecture combining client-server and peer-to-peer models, where the
client-server part handles account management, client lists and status
information, and P2P is used for the real-time video conference itself. The
proposed system decreases server traffic and cuts down the load of multimedia
communication, because the multimedia data is decentralized to the clients by the
hybrid peer-to-peer network architecture. The proposed system is implemented and
tested using a communication protocol and application software over high-speed
multimedia communication.
Key-Words: - Real-Time, Video conference,
High speed, Hybrid, P2P, Multimedia, Communication.
Title of the Paper: Measurements to Determine the Effectiveness of the
Phobias Treatments
DOWNLOAD FULL PDF
Authors:
Mauricio Plaza
Torres
Abstract: With the levels of stress involved in the everyday activities of human
beings, the number of phobias and of people who suffer from them has increased.
The direct-exposure phobia treatments used by psychologists have been shown to be
effective, but in many cases they can endanger the physical and psychological
health of the patient. In some treatments, additional trauma or physical damage
may be caused by incorrect control of the treatment; to reduce the trauma caused
by direct exposure, some medical centers use new technologies such as virtual
reality. However, the doctor does not have precise control over the measurement
variables that determine the severity of the phobia condition and indicate the
patient's clinical evolution. This article explains research whose objective is
to determine whether the virtual environment has any influence on the person's
psychological change, as seen through changes in the monitored vital signs.
Key-Words: virtual reality, Experimentation, Security, Standardization,
Measurement, Performance, Reliability, phobia, vital signs, measurements, phobia
environment, phobia treatment, Graphical environment, Interactive environments
Design.
Title of the Paper: A
Model of Implicit Term Relationship for Information Retrieval
DOWNLOAD FULL PDF
Authors:
Tetsuya Yoshida
Abstract: This paper proposes a model for dealing with implicit relationships
among terms, in the context of information retrieval over the Web. Until now,
various keyword-based search
engines have been developed to facilitate information retrieval over the Web.
However, it can still be difficult to specify appropriate keywords (terms),
which are to be provided to the engines to conduct the retrieval. We
hypothesize that, although it is not explicitly represented or specified from
the user, there can be some (hidden) relationship among the specified terms.
Such a relationship can be useful for facilitating effective retrieval, since it
can work “between the terms”, much as reading between the lines supports
effective reading. Based on this hypothesis, we propose a model for representing the
implicit relationship among the specified terms. Our model tries to capture
the implicit relationship in terms of semantic aspect, and represents it as a
concrete tree structure so that it can be utilized for further processing.
Experiments were conducted to investigate the effectiveness of the proposed
model in the context of retrieval, and the results are reported.
Key–Words: Information Retrieval, Term
Relationship, Tree Structure Thesaurus, World Wide Web
Title of the Paper:
Assessing Object-oriented Programming Skills in the Core Education of Computer
Science and Information Technology: Introducing New Possible Approach
DOWNLOAD FULL PDF
Authors:
Norazlina Khamis, Sufian Idris, Rodina Ahmad, Norisma Idris
Abstract: - Deciding how to evaluate each student's programming skills is one of
the largest challenges facing educators who teach object-oriented courses.
Traditional assessment of programming education is grade-based. Quite often
students get good grades in programming but still face great challenges or have
difficulties taking on real programming jobs. This research focuses on how we
addressed this challenge in an object-oriented programming course by proposing a
new assessment method to assess students' object-oriented programming skills. The
process begins by identifying generic object-oriented skills that students should
acquire. In this paper we discuss the issues in object-oriented programming
assessment and our proposed solution for a new assessment model. This is followed
by the approach taken in the process of identifying the object-oriented skills
using the Delphi technique, a structured multi-step process that uses a group of
experts to achieve a consensus opinion. We present the methodology of three
Delphi processes to identify object-oriented programming skills. The identified
skills will be used to guide the coverage of student learning assessments and can
be used by instructors to identify which topics merit emphasis.
Key-Words: - Object-oriented programming,
assessment, programming skills, Delphi
technique, goal questions metrics.
Title of the Paper: A
Study of Issues and Considerations of Service Interaction Management in IMS
Architecture
DOWNLOAD FULL PDF
Authors:
Hui-Na Chua , Chor-Min Tan, Yuxin Ho
Abstract: - Though IMS (IP Multimedia Subsystem) aims to provide an open
architecture environment for rapid service creation, it does not necessarily
solve all the problems of service interactions and service provisioning. The
Service Brokering function, as currently being studied by the 3GPP [1], aims to
manage service capability interactions between any type of IMS application
server. However, the Service Broker definition in the standards does not specify
precisely the mechanism by which it achieves service interaction management.
Because the definition is still not concrete, the Service Broker function is
currently implemented in a proprietary manner. In this paper, we examine the
evolution of the Service Broker functionality proposed in standards and evaluate
the existing Service Broker approaches that are implemented in proprietary ways.
From architectural and interaction management perspectives, we discuss the issues
and considerations of the Service Broker function.
Key-Words: - IMS, Service Broker, SOA,
orchestration, Service Interaction and invocation.
Title of the Paper:
Transformations Techniques for extracting Parallelism in Non-Uniform Nested
Loops
DOWNLOAD FULL PDF
Authors:
Fawzy A. Torkey, Afaf A. Salah,
Nahed M. El Desouky, Sahar A. Gomaa
Abstract: - Executing a program on parallel machines requires not only finding
sufficient parallelism in the program, but also minimizing the synchronization
and communication overheads in the parallelized program. This improves
performance by reducing the time needed to execute the program. Parallelizing and
partitioning nested loops requires efficient iteration dependence analysis.
Although many loop transformation techniques exist for nested loop partitioning,
most of them perform poorly when parallelizing nested loops with non-uniform
(irregular) dependences. In this paper, affine and unimodular transformations are
applied to solve the problem of parallelism in nested loops with non-uniform
dependence vectors. To solve this problem, a few researchers converted the
non-uniform nested loops to uniform nested loops and then found the parallelism.
We propose applying the two approaches, affine and unimodular transformations,
directly to extract and improve the parallelism in nested loops with non-uniform
dependences. The study shows that the unimodular transformation is better than
the affine transformation when the dependences in the nested loops exist only in
one statement, while the affine transformation is more effective when the nested
loops have a sequence of statements and the dependences exist between these
different statements.
Keywords: - Unimodular transformation, Affine transformation, Parallelism,
Uniform dependence, Nonuniform dependence, Distance vector, Distance matrix
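The legality condition behind unimodular loop transformations, referred to in the abstract above, can be sketched as follows: the transformation matrix T must have determinant +1 or -1 (so it maps the integer iteration space onto itself), and every transformed dependence distance vector must stay lexicographically positive. The Python sketch below checks this for a 2-deep loop nest; the skewing matrix and distance vectors are illustrative assumptions, and the affine case from the paper is not shown.

def det2(T):
    return T[0][0] * T[1][1] - T[0][1] * T[1][0]

def apply(T, d):
    return [T[0][0] * d[0] + T[0][1] * d[1],
            T[1][0] * d[0] + T[1][1] * d[1]]

def lex_positive(v):
    # Lexicographic positivity: first non-zero component must be positive.
    for x in v:
        if x != 0:
            return x > 0
    return False

def legal_unimodular(T, distance_vectors):
    return abs(det2(T)) == 1 and all(lex_positive(apply(T, d)) for d in distance_vectors)

if __name__ == "__main__":
    skew = [[1, 0], [1, 1]]           # loop skewing, det = 1
    deps = [[1, 0], [0, 1], [1, -1]]  # example distance vectors
    print(legal_unimodular(skew, deps))  # True: all skewed distances stay lexicographically positive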
Title of the Paper:
Information Society and its Development in the
Czech
Republic
DOWNLOAD FULL PDF
Authors:
Jitka Komarkova,
Pavel Sedlak, Katerina Langrova
Abstract: - The development of the information society has been strongly
supported by all governments, including the government of the Czech Republic.
Recently, ways to evaluate the efficiency of the investments and the development
of the information society have been sought. The paper deals with the utilization
of geographic information systems, spatial analyses and exploratory spatial data
analyses (both global and local indicators) for evaluating the development of the
information society and the participation of citizens. Within the case study,
development in all 14 regions of the Czech Republic during the last few years is
evaluated. The evaluation is based on available data provided by the Czech
Statistical Office. One of the results is that Prague (the capital) was the
leader in the first years, but during the last few years the other regions have
narrowed the gap on the capital, so many differences across the country have
decreased.
Key-Words: - eGovernment, Information
Society, eParticipation, Spatial Analyses, ESDA, GIS, DEMO-net.
Title of the Paper:
Four-Dimensional Multi-Inkdot Finite Automata
DOWNLOAD FULL PDF
Authors:
Yasuo Uchida, Takao Ito, Hidenobu Okabe, Makoto Sakamoto, Hiroshi Furutani,
Michio Kono
Abstract: During the past thirty-five years or so, many automata on two- or
three-dimensional input tapes have been proposed and many properties of
such automata have been obtained. On the other hand, we think that recently,
due to the advances in computer animation, motion image processing, and so
forth, it is very useful for analyzing computational complexity of
multi-dimensional information processing to explicate the properties of
four-dimensional automata, i.e., three-dimensional automata with the time
axis. In this paper, we propose a four-dimensional multi-inkdot finite
automaton and mainly investigate its recognizability of four-dimensional
connected pictures. Moreover, we briefly investigate some basic accepting
powers of four-dimensional multi-inkdot finite automata.
Key–Words: Alternation, Connected Pictures, Finite Automaton, Four-Dimension,
Inkdot, Recognizability.
Title of the Paper: A
Back-End Compiler with Fast Compilation for VLIW based Dynamic Reconfigurable
Processor
DOWNLOAD FULL PDF
Authors:
Ryuji Hada,
Kazuya Tanigawa, Tetsuo Hironaka
Abstract: We have developed a compiler for a dynamically reconfigurable processor
based on the VLIW model. The VLIW model fetches and executes one block of
configuration data as a VLIW instruction. For this model, our compiler schedules
mapping elements, i.e. operations and live variables in the program, taking
hardware resources into consideration. Next, a place-and-route procedure places
and routes the mapping elements onto hardware resources for several blocks of
configuration data. However, conventional place-and-route algorithms require a
long compilation time, because under difficult place-and-route conditions the
number of place-and-route iterations increases in order to obtain high code
quality. We therefore propose a novel compiler method that combines scheduling
and place-and-route to achieve fast compilation while maintaining code quality.
Our idea is that if scheduling simplifies the place-and-route conditions, a short
place-and-route time can still achieve reasonable code quality. In scheduling, to
balance the number of operations and live variables and to ease the
place-and-route conditions, operations are moved to another step with fewer
operations. In place-and-route, to reduce the number of iterations needed to
obtain a reasonable result, the targets for re-place-and-reroute are limited. In
this paper, we use PARS as one of the target processors based on the VLIW model.
We evaluate our method and compare it with another method based on Simulated
Annealing (SA). From the results, our method achieves a difference in code
quality (the number of configuration data blocks, i.e. VLIW instructions) of
-3.4% to +1.2%, while compilation time is cut to 1/128 to 1/67 compared with the
SA-based method.
Key-Words: VLIW, Reconfigurable processor,
compiler, scheduling, place and route.
Title of the Paper: An
Improved Steganalysis Approach for Breaking the F5 Algorithm
DOWNLOAD FULL PDF
Authors:
Hatim Aboalsamh, Hassan Mathkour, Sami Dokheekh, Mona Mursi, Ghazy Assassa
Abstract: In this paper, we present an enhancement to the steganalysis
algorithm that successfully attacks the F5 steganographic algorithm using JPEG
digital images. The key idea is related to the selection of an “optimal” value
of β (the probability that a non-zero AC coefficient will be modified) for the
image under consideration. Rather than averaging the values of β over the 64
shifting steps applied to an image, an optimal β is determined that corresponds
to the shift having minimal distance E from the double compression removal
step. Numerical experiments were carried out to validate the proposed enhanced
algorithm and compare it against the original one. Both algorithms were tested
and compared using two sets of test images. The first set uses reference test
data of 20 grayscale images [1], and the second uses 432 images created by
manipulating 12 images for various image parameters: two sizes (300x400 and
150x2000), six JPEG old quality factors (50, 60, 70, 80, 90, 100), and 3
message lengths (0, 1kB, 2 kB). The results suggest that the original
algorithm may be used as a classifier, since it shows a good detection
performance of both clean and stego test images; whereas, the proposed
enhanced algorithm may be used as an estimator for the true message length for
those images that have been classified by the original algorithm as stego images.
Keywords: Steganalysis, Steganography, F5
algorithm, Discrete Cosine Transforms DCT, Matrix encoding, JPEG images.
Title of the Paper:
Context-Aware Remote Healthcare Support System based on Overlay Network
DOWNLOAD FULL PDF
Authors:
Debasish Chakraborty, Hideyuki Takahashi, Takuo Suganuma, Atsushi Takeda,
Gen Kitagata, Kazuo Hashimoto, Norio Shiratori
Abstract: Many countries are facing an ever-growing need to supply constant care
and support for their disabled and elderly populations, so remote monitoring is
becoming essential for looking after them. At the same time, an environment full
of smart and cooperating artifacts poses great risks to personal privacy. To
protect personal information, a patient's data should be available irrespective
of their location, but only to authorized persons. The quality and reliability of
the data transfer are also important, depending on the content of the data and
the recipient. In this paper we propose a system in which a personalized overlay
network is built on an ad hoc basis and links between different entities are
established according to the social relationship between the person under
observation and the people at the other end, as well as the situation of the
observed person. The connected links provide reliability, quality and other
required characteristics according to the requirements specified by the members
involved. For efficient cost and resource utilization, an on-demand network
connection is considered for our proposed context-aware ubiquitous healthcare
system.
Key-Words: Adaptive context aware, Overlay
networks, Ubiquitous healthcare, Multi-agent system, Security.
Issue 10, Volume 7, October 2008
Title of the Paper: The
Framework of the Turkish Syllable-Based Concatenative Text-to-Speech System
with Exceptional Case Handling
DOWNLOAD FULL PDF
Authors:
Zeynep Orhan,
Zeliha Gormez
Abstract: - This paper describes the TTTS (Turkish Text-To-Speech) synthesis
system, developed at Fatih University for the Turkish language. The framework of
the Turkish syllable-based concatenative text-to-speech system with exceptional
case handling is introduced. TTTS is a concatenative TTS system aiming to advance
the process of developing natural and human-sounding Turkish voices. The
resulting system is implemented by concatenating pieces from a limited number of
pre-recorded speech units that are stored in a database. Systems differ in the
size of the stored speech units, which affects the output range, quality and
clarity; therefore, the number of concatenation units, the synthetic units
obtained and the computational power required should be kept in balance. The
letters of the Turkish alphabet and the syllables that consist of at most two
letters are used as the smallest phonemes in the context of this study. The
syllables that have more than two letters are derived from these smallest units.
Words that are generally borrowed from other languages through cultural
interactions present exceptional behaviors and must be handled specifically. The
results are evaluated by using the Degradation Mean Opinion Score (DMOS) method.
Key-Words: - Text-to-speech (TTS), Speech
Synthesis, Concatenative Turkish TTS.
Title of the Paper:
Simulation of Production and Transportation Planning with Uncertainty and Risk
DOWNLOAD FULL PDF
Authors:
Kuentai
Chen, Hung-Chun Chen, Z. H. Che
Abstract: - Uncertainties, including uncertain demand and various risks such as
machine failure and transportation loss, are inevitable in practical supply chain
planning and are fundamental issues for all members of the supply chain. In this
research, a mathematical model of a supply chain with risk and uncertain demand
is established and solved. The inherent complexity of such an integer programming
model makes it difficult to find exact integer optimal solutions quickly.
Therefore, a quick and decent answer becomes essential to keep pace with the
competitive business world, even if it is usually only an approximate estimate.
Four types of model are discussed in this study: certain demand without risk,
certain demand with risk, uncertain demand without risk, and uncertain demand
with risk. After model verification and validation, computer simulations are
performed with three selection policies, namely “low cost first”, “random”, and
“minimum cost path”. The results are analyzed and compared, showing that “minimum
cost path” is the better policy for node selection according to the simulation
runs. A general linear programming solver called LINDO was used to find the
optimal solutions but took days as the problem size increased, while the
simulation model obtains an acceptable solution in minutes. For small problems,
numerical examples show that the Mean Absolute Percentage Error (MAPE) between
the integer simulation solution and the mathematical non-integer solution falls
into the range of 3.69% to 7.34%, which demonstrates the feasibility and
advantage of using simulation for supply chain planning.
Key-Words: Supply Chain, Risk, Simulation,
Integer Programming, Uncertainty.
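For reference, the MAPE figure quoted in the abstract above is assumed here to follow its standard definition (the paper's exact convention is not given in the abstract):

\[
  \mathrm{MAPE} = \frac{100\%}{n}\sum_{t=1}^{n}\left|\frac{A_t - F_t}{A_t}\right|,
\]

where A_t would denote the exact (non-integer, LINDO) objective value of instance t and F_t the corresponding integer simulation result.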
Title of the Paper: Sequential
and Parallel Deficit Scaling Algorithms for Minimum Flow in Bipartite Networks
DOWNLOAD FULL PDF
Authors:
Laura Ciupala, Eleonor Ciurea
Abstract: - In this paper, first we describe the deficit scaling algorithm for
minimum flow in bipartite networks. This algorithm is obtained from the
deficit scaling algorithm for minimum flow in regular networks developed by
Ciupală in [5] by replacing a pull from a node with sufficiently large deficit
with two consecutive pulls. This replacement ensures that only nodes in N₁ can
have deficits. Consequently, the running time of the deficit scaling algorithm
for minimum flow is reduced from O(nm + n² log C) to O(n₁m + n₁² log C) when it
is applied to bipartite networks. In the last part of this paper, we develop a
parallel implementation of the deficit scaling algorithm for minimum flow in
bipartite networks on an EREW PRAM. The parallel bipartite deficit scaling
algorithm performs a pull from one active node in N₁ with a sufficiently large
deficit and the smallest distance label, followed by a set of pulls from several
nodes in N₂ in parallel. It runs in O(n₁² log C log p) time on an EREW PRAM with
p = m/n₁ processors, which is within a logarithmic factor of the running time of
the sequential bipartite deficit scaling algorithm for minimum flow.
Key-Words: - Network flow; Network algorithms; Bipartite network; Parallel
algorithms; Minimum flow problem; Scaling technique
Title of the Paper: Segmentation Techniques for Target
Recognition
DOWNLOAD FULL PDF
Authors:
G.
N. Srinivasan, Shobha G.
Abstract:
This paper presents an overview of the methodologies and algorithms for
segmenting 2D images as a means of detecting target objects embedded in visual
images for an Automatic Target Detection application.
Keywords: Target Detection, Image Processing, Pattern Recognition,
Segmentation.
Title of the Paper:
Multimedia Interactive Environment for Study the Plane Analytical Geometry
DOWNLOAD FULL PDF
Authors: Anca
Iordan, George Savii, Manuela Panoiu, Caius Panoiu
Abstract: - This work presents the development of an educational software system
for studying elements of plane analytical geometry. The resulting system can be
used for teaching geometry both in pre-university and in university education.
Incorporating modern methods and techniques, the presented software leads the
user to gain experience in understanding and handling knowledge in the field of
geometry and grants easy and efficient access to the newest information and
knowledge.
Key-Words: - Interactive software, plane analytical geometry, Java, distance
education.
Title of the Paper:
Reduced-Set Vector-Based Interval Type-2 Fuzzy Neural Network
DOWNLOAD FULL PDF
Authors: Long
Yu, Jian Xiao, Song Wang
Abstract: - This paper describes an interval type-2 fuzzy modeling framework,
reduced-set vector-based interval type-2 fuzzy neural network (RV-based
IT2FNN), to characterize the representation in fuzzy logic inference
procedure. The model proposed introduces the concept of interval kernel to
interval type-2 fuzzy membership, and provides an architecture to extract
reduced-set vectors for generating interval type-2 fuzzy rules. Thus, the
overall RV-based IT2FNN can be represented as series expansion of interval
kernel, and it does not have to determine the number of rules in advance. By
using a hybrid learning mechanism, the proposed RV-based IT2FNN can construct
an input-output mapping from the training data in the form of fuzzy rules.
Finally, simulation results show that the obtained RV-based IT2FNN possesses good
generalization and transparency.
Key-Words: - interval type-2; fuzzy modeling; reduced-set; interval kernel;
insensitive learning.
Title of the Paper:
Grammar-based Classifier System: A Universal Tool for Grammatical Inference
DOWNLOAD FULL PDF
Authors:
Olgierd Unold
Abstract: - Grammatical inference deals with the problem of learning structural
models, such as grammars, from different sorts of data patterns, such as
artificial languages, natural languages, biosequences, speech and so on. This
article describes a new grammatical inference tool, the Grammar-based Classifier
System (GCS), dedicated to learning grammars from data. GCS is a new model of
Learning Classifier Systems in which the population of classifiers has the form
of a context-free grammar rule set in Chomsky Normal Form. GCS has been proposed
to address both regular language induction and natural language grammar
induction, as well as learning formal grammars for DNA sequences. In all cases,
near-optimal solutions, or solutions better than those reported in the
literature, were obtained.
Key-Words: - Machine Learning, Grammatical Inference, Learning Classifier
Systems, Regular Language Induction, DFA Induction, Natural Language
Processing, Promoter Recognition.
Title of the Paper:
Several Aspects of Context Freeness for Hyperedge Replacement Grammars.
DOWNLOAD FULL PDF
Authors: Silviu
Dumitrescu
Abstract: - In this paper we survey several aspects related to normal forms of
hyperedge replacement grammars. Considering context-free hyperedge replacement
grammars, we introduce, inspired by string grammars, Chomsky Normal Form and
Greibach Normal Form. The conversion algorithm is much the same as the algorithm
for string grammars. The important difference is that hyperedge grammars are
two-dimensional, and that is why parsing productions, in order to transform them
into string grammars, can only be done nondeterministically. A detailed example
of conversion to both normal forms is given to clarify all the algorithm steps.
Key-Words: - Hyperedge Replacement Grammars, Normal Form, Chomsky, Greibach,
Context Freeness, Nondeterministic.
Title of the Paper: Using
COBIT Indicators for Measuring Scrum-based Software Development
DOWNLOAD FULL PDF
Authors: Viljan
Mahnic, Natasa Zabkar
Abstract: - The aim of this paper is to determine the level of compliance of the
AGIT model, developed during our previous research for measuring Scrum-based
software development, with information systems auditing criteria. For this
purpose we use the COBIT model. After a short introduction to Scrum, AGIT and
COBIT, we perform a comparative analysis of their indicators for software
development. Then we upgrade the AGIT model with the selected COBIT indicators.
In order to improve the clarity of the model, we present its structure using the
IT Balanced Scorecard. Finally, we suggest possible further research.
Key-Words: - Scrum, Agile software development, IT performance measurement, IT
indicators, IT Balanced Scorecard, COBIT, AGIT.
Title of the Paper:
Counter Register. Algebraic Model and Applications
DOWNLOAD FULL PDF
Authors:
Anca Vasilescu
Abstract: A hardware system consists of interconnected components, that is,
communicating and synchronized components. Since the number of interconnected
components in a computer system is continuously increasing, it is useful to have
an alternative to simulation-based verification of computer operation. In this
paper we consider a specific component of modern computer systems, namely a
counter register, and we propose an algebraic approach as a solution for modeling
and verifying the specification agents. In the final part of the paper, we
mention some practical applications of the counter register, both in everyday
life and in the internal structure of computer systems.
Key–Words: communication, counting operation, hardware system, modeling, SCCS
process algebra, synchronization, verification.
Title of the Paper: A New
Scalable Distributed Authentication for P2P Network and its Performance
Evaluation
DOWNLOAD FULL PDF
Authors:
Atsushi Takeda, Debasish Chakraborty, Gen Kitagata, Kazuo Hashimoto, Norio Shiratori
Abstract: Recently, P2P networks have become more and more popular. Though they
have many advantages, P2P networks suffer from difficulties in node
authentication. To overcome this problem, a new authentication method called the
Hash-based Distributed Authentication Method (HDAM) is proposed in this paper.
HDAM realizes a decentralized, efficient mutual authentication mechanism for each
pair of nodes in the P2P network. It performs distributed management of public
keys by using a Web of Trust and a Distributed Hash Table. Our proposed scheme
significantly reduces both the memory size requirement and the overhead of
communication data sent by the nodes. Additionally, the results show that the
required resource size of HDAM is O(log n) and that HDAM is more scalable than
the conventional method.
Key–Words: Distributed authentication, Decentralized public key exchange,
Peer-to-peer network
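As a toy illustration of hash-based public-key lookup in the spirit of a DHT, the Python sketch below maps the hash of a node identifier to its public key in a single in-memory table. A real HDAM deployment distributes this table across nodes and combines it with a Web of Trust, none of which is modelled here; the class and identifiers are invented.

import hashlib
from typing import Optional

class ToyKeyDirectory:
    def __init__(self):
        self._table = {}  # hash of node id -> public key bytes

    @staticmethod
    def _key(node_id: str) -> str:
        return hashlib.sha1(node_id.encode("utf-8")).hexdigest()

    def publish(self, node_id: str, public_key: bytes) -> None:
        self._table[self._key(node_id)] = public_key

    def lookup(self, node_id: str) -> Optional[bytes]:
        return self._table.get(self._key(node_id))

if __name__ == "__main__":
    directory = ToyKeyDirectory()
    directory.publish("node-A", b"-----BEGIN PUBLIC KEY----- ...")
    print(directory.lookup("node-A") is not None)  # True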
Title of the Paper: A
Survey of Automata on Three-Dimensional Input Tapes
DOWNLOAD FULL PDF
Authors:
Makoto Sakamoto,
Naoko Tomozoe, Hiroshi Furutani, Michio Kono, Takao Ito, Yasuo Uchida,
Hidenobu Okabe
Abstract: The question of whether processing three-dimensional digital patterns
is much more difficult than processing two-dimensional ones is of great interest
from both theoretical and practical standpoints. Recently, due to advances in
many application areas such as computer vision, robotics, and so forth, it has
become increasingly apparent that the study of three-dimensional pattern
processing is of crucial importance. Thus, research on three-dimensional automata
as computational models of three-dimensional pattern processing has also been
meaningful. The main purpose of this paper is to survey the definitions and
properties of various three-dimensional automata.
Key–Words: Computation, Constructability, Finite Automaton, Inkdot, Marker,
Recognizability, Three-Dimension, Turing Machine
Title of the Paper: A
General Approach to Off-line Signature Verification
DOWNLOAD FULL PDF
Authors:
Bence Kovari, Istvan Albert, Hassan Charaf
Abstract: - Although automatic off-line signature verification has been
extensively studied in the last three decades, there are still many open
questions, and even the best systems are still struggling to achieve error rates
better than 5%. This paper targets some of the weak spots of the research
area, which are comparability, measurability and interoperability of different
signature verification systems. After delivering an overview of some of the
main research directions, this paper proposes a generic representation of
signature verifiers. In the first part of the paper it is shown how existing
verification systems compare to the generic model, detailing the differences
and their resolutions. In the second part a signature verification framework
is introduced, which was created based on the generic model. It is
demonstrated how existing algorithms and even an existing signature verifier
can be modularized and modified to allow an execution through the framework.
Finally the benefits of the model are outlined including the unified
benchmarking, comparability of different systems and the support for
distributed software architectures like SOA.
Key-Words: - signature verification; off-line; unified model; component based;
loose coupling
Title of the Paper:
Implementation of Semantic Services in
Enterprise
Application Integration
DOWNLOAD FULL PDF
Authors:
Bence Peter
Martinek, Balazs Tothfalussy, Bela Szikora
Abstract: - In this paper, we present an approach for the implementation of
semantically enriched services in Enterprise Application Integration (EAI). We
present an integration platform based on a Service Oriented Architecture (SOA)
which consists of a service registry, a process designer and a run-time
engine. There are some additional components for realizing semantic enrichment
of services and composed processes e.g. the semantic profiler and the
Ontology. The focus of the paper is the preparation for the process run-time.
We propose a mediator based approach where data transformations are assigned
to each service during the deployment. The standard services of ERP, CRM, SCM
etc. systems are encapsulated into mediator services, which makes it possible to
apply them in a semantic integration framework. Still, the created semantic
services remain compatible with current Web service standards and communicate
with standard SOAP messages. Hence the collaborative processes composed by
attached semantic meta-information of services are also executable by standard
Business Process Execution Language (BPEL) run-time engine.
Key-Words: - Enterprise
application integration, Semantic services’ run-time, Collaborative business
processes.
Title of the Paper: Key
Consumer Group for Late Comers in Network Effect Product Markets: A
Multi-agent Modeling Approach
DOWNLOAD FULL PDF
Authors: Zhou
Geng
Abstract: - The first movers in network effect market enjoy the first-mover
advantages. However, through proper strategy, late comers still hold a chance
to win the competition. I divide the consumers into 5 categories (active
rational, passive rational, active non-rational, passive non-rational and
sheep herd) and simulate the market with multi-agent modeling method. Through
analyzing the interactions between these consumer groups, I find that the
active non-rational group is the most important consumer group for the late
comers—late comers should focus on attracting this group first so that they
may compete with the first movers. In support of my study, I also provide case
studies of China’s blog
service market and China’s
online game market.
Key-Word: - multi-agent modeling, network effect, consumer group, first mover
advantage, consumer behavior, decision making, herd behavior.
Title of the Paper:
Implementation of Classifiers for Choosing Insurance Policy Using Decision
Trees: A Case Study
DOWNLOAD FULL PDF
Authors:
Chin-Sheng
Huang, Yu-Ju Lin, Che-Chern Lin
Abstract: - In this paper, we use decision trees to establish the decision
models for insurance purchases. Five major types of insurances are involved in
this study including life, annuity, health, accident, and investment-oriented
insurances. Four decision tree methods were used to build the decision models
including Chi-square Automatic Interaction Detector (CHAID), Exhaustive
Chi-square Automatic Interaction Detector (ECHAID), Classification and
Regression Tree (CRT), and Quick-Unbiased-Efficient Statistical Tree
(QUEST). Six features were selected as the inputs of the decision trees
including age, sex, annual income, educational level, occupation, and risk
preference. Three hundred insurants from an insurance company in
Taiwan
were used as examples for establishing the decision models. Two experiments
were conducted to evaluate the performance of the decision trees. The first
one used the purchase records of primary insurances as examples. The second
one used the purchase records of primary insurances and additional insurances.
Each experiment contained four rounds according to different partitions of
training sets and test sets. Discussion and concluding remarks are finally
provided at the end of this paper.
Key-Words: - Insurance policy; Decision tree; Decision model; ECHAID; CRT;
CHAID; QUEST; Classification tree; Decision support system.
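The sketch below shows the general shape of such a decision model using scikit-learn's DecisionTreeClassifier, which implements a CART-style tree (closest to the CRT method named above; CHAID, ECHAID and QUEST are not available in scikit-learn). The six features mirror those listed in the abstract, but the tiny synthetic sample and the policy encoding are invented.

from sklearn.tree import DecisionTreeClassifier

# Feature order: age, sex (0/1), annual income, education level, occupation code, risk preference
X = [
    [25, 0, 30000, 2, 1, 0],
    [40, 1, 80000, 3, 2, 1],
    [33, 0, 52000, 3, 3, 2],
    [58, 1, 61000, 1, 2, 0],
    [29, 1, 45000, 2, 1, 2],
    [47, 0, 95000, 4, 4, 1],
]
# Target: chosen policy type (e.g. 0=life, 1=annuity, 2=health, 3=accident, 4=investment)
y = [0, 4, 2, 1, 2, 4]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[35, 0, 50000, 3, 3, 2]]))  # predicted policy type for a new insurant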
Title of the Paper:
Design of Individualizing Learning Framework with Fuzzy Expert System and
Variable Learning Route
Model: A Case Study of UML
DOWNLOAD FULL PDF
Authors:
Che-Chern Lin,
Shen-Chien Chen, Chin-Chih Lin
Abstract: - In this paper, we discuss how a fuzzy expert system is utilized in
education. We present a conceptual framework for designing individualizing
learning materials using a fuzzy expert system and a variable learning route
model. The framework can help teachers to design their customized teaching
materials for individual students
based on the academic achievements of the students. In the framework, we first
use pre-assessment to evaluate the students’ academic achievements. The fuzzy
expert system is then used to select
suitable learning material for the students according to their academic
achievements. The variable learning route model serves to determine the
adaptive learning paths for the students based on the results of the fuzzy
expert system. We introduce the concepts of the learning model with a
simulative example. Unified Modeling Language (UML) is utilized in this paper
to describe the structures and behaviors of the proposed framework. Discussion
and concluding remarks are finally provided at the end of this paper.
Key-words: Fuzzy expert system, Variable route model, Individual learning,
Adaptive learning, UML, OOAD, System analysis and design.
Title of the Paper:
Managing Ontology Change and Evolution via a Hybrid Matching
DOWNLOAD FULL PDF
Authors:
Saravanan
Muthaiyah, Marcel Barbulescu, Larry Kerschberg
Abstract: - In this paper, we present the problem of ontology evolution and
change management. We provide a systematic approach to solve the problem by
adopting a multi-agent system (MAS). The core of our solution is the Semantic
Relatedness Score (SRS) which is an aggregate score of five well-tested
semantic as well as syntactic algorithms. The focus of this paper is to
resolve current problems related to ontology upgrade and managing evolution
amongst shared ontologies. This paper highlights issues pertaining to
ontological changes in a shared ontology environment which includes creating,
renaming, deletion and modification of existing classes. These changes will
definitely impact shared concepts and users would have to update their local
ontologies to be consistent with changes in the commonly shared ontology. We
propose a less laborious method to achieve this by using a semi-automated
approach where the bulk of the processing is carried out by matching agents that
would eliminate extraneous data and hence would only recommend to the
ontologist data that can actually be upgraded. We have also designed and built
a prototype in the Java Agent DEvelopment Framework (JADE) for
proof-of-concept.
Key-Words: - Hierarchical Repository, Semantic Matching, Syntactic Matching,
Agent, Ontology.
Title of the Paper:
Development of Bio-Mimetic Entertainment Dolphin Robots
DOWNLOAD FULL PDF
Authors:
Daejung Shin,
Seung Y. Na, Jin Y. Kim, Yong-Gwan Won, Bae-Ho Lee
Abstract: - The development of bio-mimetic entertainment dolphin robots that act
like real dolphins in terms of autonomous swimming and human-dolphin interaction
is introduced. Body structures, sensors and actuators, governing microcontroller
boards, and swimming and interaction features are described for a typical
entertainment dolphin robot. Mouth-opening, tail splashes or a water blow through
the spout hole are the typical interaction responses when touch sensors on the
body detect a user's demand. In order to improve the entertainment dolphin
robot's ability to interact with people, a pair of microphones serving as the
robot's ears is used to estimate the directions of peak sound from surrounding
viewers. Dolphin robots should turn towards people who demand to interact with
them while swimming autonomously.
Key-Words: - Entertainment Dolphin Robot, Bio-mimetic, Interaction, Autonomous
Dolphin System
Title of the Paper: MPI
based Parallel Method of Moments Approach for Microstrip Structures Analysis
DOWNLOAD FULL PDF
Authors:
Francisco
Cabrera, Eugenio Jimenez
Abstract: In this paper we present a parallel Method of Moments (MoM for short)
technique using the MPI library. Here, the MoM is used to analyze microstrip
structures. The main goals are efficient parallel coefficient computation and
efficient linear equation system solving. The efficiency and accuracy of the
parallel-processing MoM code are analyzed through several examples with two data
distributions using the ScaLAPACK library.
Key–Words: MoM, microstrip, parallel, MPI, ScaLAPACK
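The mpi4py sketch below illustrates one simple way to distribute the fill of a dense MoM impedance matrix across MPI processes (cyclic rows per rank); the actual integral-equation kernel and the ScaLAPACK solve used in the paper are replaced by a dummy kernel and a reduction, and the script name is hypothetical.

# Run with, e.g.: mpiexec -n 4 python mom_fill.py
from mpi4py import MPI
import numpy as np

def kernel(m, n):
    # Placeholder interaction term; a real MoM code evaluates coupling integrals here.
    return 1.0 / (1.0 + abs(m - n))

N = 1000
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

rows = range(rank, N, size)          # cyclic row distribution across processes
Z_local = np.array([[kernel(m, n) for n in range(N)] for m in rows])

local_sum = Z_local.sum()
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print(f"filled {N}x{N} matrix on {size} processes, checksum {total:.3f}")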
Title of the Paper:
Aspects of Dictionary Making Developing an In-House Text Parsing Tool
DOWNLOAD FULL PDF
Authors:
Livia
Sangeorzan, Marinela Burada, Kinga Kiss Iakab
Abstract: - This paper reports on particular aspects of ongoing research
funded by the National University Research Council and conducted by an
interdisciplinary team of academics from Transilvania University of Braşov,
Romania, of which the authors of the present contribution are members. Based
on the results yielded by a large-scale survey of seventy online
bilingual/multilingual dictionaries involving the English and Romanian
domains, we begin with an assessment of the status quo in the area of
glossaries and dictionaries available on the internet; we then focus on one
particular aspect of dictionary design, i.e. the development and operation of
a flexible, customizable scanner-parser that we designed with a view to
optimizing the work associated with data collection and dictionary compiling.
Key-Words: - accessibility, customizability, interstructure, Java,
macrostructure, online dictionary, parsing tool
Title of the Paper: An
Arbitration Web Service for E-learning based on XML Security Standards
DOWNLOAD FULL PDF
Authors:
Robert Andrei
Buchmann, Sergiu Jecan
Abstract: This paper promotes a non-repudiation system for student evaluation
in an e-learning environment based on web services, AJAX frameworks and PEAR
packages in order to implement XML security standards, to provide improved
user experience, asynchronous data exchange and message authentication for
on-line test papers. The motivation of this research is the need to arbitrate
on-line evaluation for both parties, the e-teacher and the e-student, with
respect to the evaluation criteria and the rating of open answers.
Keywords: - e-learning, XML, digital signature,
AJAX.
Title of the Paper:
Querying XML Documents with XPath/XQuery in Presence of XLink Hyperlinks
DOWNLOAD FULL PDF
Authors:
Lule Ahmedi,
Mentor Arifaj
Abstract: Nowadays, XML documents are present on the Web more than ever. Using
XLink hyperlinks, a document can refer to different portions of information in
different documents on the Web. With XLink hyperlinks, XML documents on the Web
can be investigated from a querying point of view, not just while browsing these
documents. There are few implementations of the XLink recommendation and just two
implementations of querying XML documents using XPath/XQuery expressions in the
presence of XLink hyperlinks. On the other side, XPath/XQuery has many
implementations and wide support from IT leaders worldwide. In this paper,
another way of using XPath/XQuery to query XML documents in the presence of
hyperlinks is shown. The paper presents a model and its implementation. To
materialize the new model, the Saxon-B processor is used, with which an extension
named BOTA was created. The BOTA prototype can be used in different web
applications and can easily be integrated into Web Services in order to select
something narrower than the whole XML document. The prototype can especially be
used in applications with RSS and Atom.
Key-Words: - XML, Querying XML, XLink hyperlinks, XML Applications, LDAP.
Title of the Paper:
Real-Time Background Subtraction using Adaptive Thresholding and Dynamic
Updating for Biometric Face Detection
DOWNLOAD FULL PDF
Authors: K.
Sundaraj
Abstract: Face biometrics is an automated method of recognizing a person’s
face based on a physiological or behavioral characteristic. Face recognition
works by first obtaining an image of a person. This process is usually known
as face detection. In this paper, we describe an approach for face detection
that is able to locate a human face embedded in an outdoor or indoor
background. Segmentation of novel or dynamic objects in a scene, often
referred to as background subtraction or foreground segmentation, is a
critical early step in most computer vision applications in domains such as
surveillance and human-computer interaction. All previous implementations aim
to handle properly one or more problematic phenomena, such as global
illumination changes, shadows, highlights, foreground-background similarity,
occlusion and background clutter. Satisfactory results have been obtained but
very often at the expense of real-time performance. We propose a method for
modeling the background that uses per-pixel time-adaptive Gaussian mixtures in
the combined input space of pixel color and pixel neighborhood. We add a
safety net to this approach by splitting the luminance and chromaticity
components in the background and use their density functions to detect shadows
and highlights. Several criteria are then combined to discriminate foreground
and background pixels. Our experiments show that the proposed method possesses
robustness to problematic phenomena such as global illumination changes,
shadows and highlights, without sacrificing real-time performance, making it
well-suited for live video applications such as face biometrics that require
face detection and recognition.
Key–Words: Background Modeling, Face Detection, Biometric Identification.
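A compressed illustration of the background-modelling idea: the sketch below
keeps a single time-adaptive Gaussian per pixel and flags pixels that deviate
by more than a few standard deviations, updating the model only on background
pixels. The paper's method is richer (per-pixel Gaussian mixtures over colour
plus neighbourhood, with a separate luminance/chromaticity test for shadows
and highlights); the learning rate and thresholds below are illustrative only.

# Simplified sketch: one running Gaussian per pixel for background subtraction.
import numpy as np

class RunningGaussianBackground:
    def __init__(self, first_frame, alpha=0.02, k=2.5):
        f = first_frame.astype(np.float32)
        self.mean = f
        self.var = np.full_like(f, 15.0 ** 2)   # illustrative initial per-pixel variance
        self.alpha, self.k = alpha, k

    def apply(self, frame):
        f = frame.astype(np.float32)
        d2 = (f - self.mean) ** 2
        # Foreground if the pixel deviates more than k standard deviations
        # in any colour channel.
        fg = (d2 > (self.k ** 2) * self.var).any(axis=-1)
        # Update the model only where the pixel was classified as background.
        upd = (~fg)[..., None]
        self.mean = np.where(upd, (1 - self.alpha) * self.mean + self.alpha * f, self.mean)
        self.var = np.where(upd, (1 - self.alpha) * self.var + self.alpha * d2, self.var)
        return fg   # boolean mask, True = foreground candidate (e.g. a face region)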
Title of the Paper: The
Development of Evaluation Indicators for LEGO Multimedia Instructional
Material
DOWNLOAD FULL PDF
Authors: Eric
Zhi Feng Liu, Shan Shan Chen, Chun Hung Lin, Yu Fang Chang, Wen Ting Chen
Abstract: - Robotics education is becoming increasingly important, but there
is a lack of evaluation indicators for evaluating robotics multimedia
instructional material. Therefore, the researchers developed evaluation
indicators for robotics multimedia instructional material in this paper. The
researchers applied content analysis in the first stage and then applied the
Delphi technique, inviting 2 robotics teachers, 4 experts in multimedia
instructional material design and 4 users of robotics multimedia
instructional material to develop the scale. The developed scale includes 40
evaluation indicators that can be classified into 4 factors: motivation,
interface design, content, and feasibility.
Key-Words: - LEGO, MINDSTORMS NXT, Robot, Evaluation indicator, Multimedia
instructional material
Title of the Paper:
Solving Large Scale Optimization Problems by Opposition-Based Differential
Evolution (ODE)
DOWNLOAD FULL PDF
Authors: Shahryar
Rahnamayan, G. Gary Wang
Abstract: This work investigates the performance of Differential Evolution
(DE) and its opposition-based version (ODE) on large scale optimization
problems. Opposition-based differential evolution (ODE) has been proposed
based on DE; it employs opposition-based population initialization and
generation jumping to accelerate convergence speed. ODE shows promising
results in terms of convergence rate, robustness, and solution accuracy. A
recently proposed seven-function benchmark test suite for the CEC-2008 special
session and competition on large scale global optimization has been utilized
for the current investigation. Results interestingly confirm that ODE
outperforms its parent algorithm (DE) on all high dimensional (500D and 1000D)
benchmark functions (F1-F7). Furthermore, the authors recommend utilizing
ODE for more complex search spaces as well, because the results confirm that
ODE performs much better than DE when the dimensionality of the problems is
increased from 500D to 1000D. All required details about the testing
platform, comparison methodology, and achieved results are provided.
Key–Words: Opposition-Based Differential Evolution (ODE), Opposition-Based
Optimization (OBO), Opposition-Based Computation (OBC), Cooperative
Coevolutionary Algorithms (CCA), Large Scale Optimization, Scalability,
High-Dimensional Problems
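The two opposition-based ingredients named in the abstract can be sketched
compactly; the Python fragment below shows opposition-based initialization
and opposition-based generation jumping for a bound-constrained problem. The
jumping rate, population size and the sphere test function are illustrative
assumptions, not the CEC-2008 settings used in the paper.

# Sketch of the two opposition-based components described in the abstract.
import numpy as np

def opposite(pop, low, high):
    """Opposite point of x in [low, high] is low + high - x (component-wise)."""
    return low + high - pop

def opposition_init(pop_size, dim, low, high, fitness):
    pop = np.random.uniform(low, high, (pop_size, dim))
    both = np.vstack([pop, opposite(pop, low, high)])
    # Keep the pop_size fittest individuals out of the population and its opposite.
    return both[np.argsort([fitness(x) for x in both])[:pop_size]]

def generation_jumping(pop, fitness, jump_rate=0.3):
    if np.random.rand() >= jump_rate:
        return pop
    # Opposites are computed against the current population's bounds,
    # which shrink as the search converges.
    low, high = pop.min(axis=0), pop.max(axis=0)
    both = np.vstack([pop, opposite(pop, low, high)])
    return both[np.argsort([fitness(x) for x in both])[:len(pop)]]

# Example: opposition-based initialization for the sphere function in 1000 dimensions.
# sphere = lambda x: float(np.sum(x ** 2))
# pop = opposition_init(100, 1000, -100.0, 100.0, sphere)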
Title of the Paper: A
Complete Analyze of using Shift Registers in Cryptosystems for Grade 4, 8 and
16 Irreducible Polynomials
DOWNLOAD FULL PDF
Authors: Mirella
Amelia Mioc
Abstract: A Linear Feedback Shift Register (LFSR) is always the kernel of
any digital system based on pseudorandom bit sequences and is frequently
used in cryptosystems, in error-detecting codes and in wireless
communication systems. The Advanced Encryption Standard (Rijndael) is based
on the use of a grade 8 irreducible polynomial in a Galois field. For a
better understanding, this study covers aspects of the functioning of the
Linear Feedback Shift Register and the Multiple Input-Output Shift Register
(MISR) using grade 4, 8 and 16 irreducible polynomials. The experiment shows
that the Linear Feedback Shift Register and the Multiple Input-Output Shift
Register have the same function. The conclusion of this paper is that, for
grade 8 and 16 irreducible polynomials, the weights are calculated with a
formula discovered in this work.
Key-Words: Cryptosystem, Shift registers, Calculate, Irreducible polynomials,
Simulate, Rijndael, Pseudo-Random Sequence, Error Detect.
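As background for readers, a Fibonacci-style LFSR can be written in a few
lines; the sketch below uses the well-known maximal-length 8-bit tap set
(8, 4, 3, 2), which is only an example and not necessarily the polynomial
studied in the paper.

# Minimal Fibonacci-style LFSR over GF(2); the feedback polynomial
# x^8 + x^4 + x^3 + x^2 + 1 is a common illustrative choice.
def lfsr(seed: int, taps, width: int):
    """Yield one pseudorandom output bit per clock; taps are the polynomial exponents."""
    state = seed & ((1 << width) - 1)
    assert state != 0, "the all-zero state is a fixed point"
    while True:
        out = state & 1
        fb = 0
        for t in taps:                      # XOR of the tapped stages = feedback bit
            fb ^= (state >> (t % width)) & 1
        state = (state >> 1) | (fb << (width - 1))
        yield out

# First 16 output bits of an 8-bit register:
gen = lfsr(seed=0b0000_0001, taps=(8, 4, 3, 2), width=8)
print([next(gen) for _ in range(16)])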
Issue 11, Volume 7, November 2008
Title of the Paper: Short Term Wind Speed Prediction using Support Vector
Machine Model
DOWNLOAD FULL PDF
Authors: K.
Sreelakshmi, P. Ramakanth Kumar
Abstract: Short-term wind speed prediction is required to assess the effect
of wind on different objects in action in free space, such as rockets,
navigating ships and planes, guided missiles, and satellites during launch.
Forecasting also helps in the use of wind energy as an alternative source of
energy in electrical power generation plants. The wind speed depends on the
values of other atmospheric variables, such as pressure, moisture content,
humidity and rainfall. This paper reports a Support Vector Machine model for
short-term wind speed prediction. The model uses the values of these
parameters, obtained from the nearest weather station, as input data. The
trained model is validated using a part of the data. The model is then used
to predict the wind speed, using the same meteorological information.
Keywords —
Short term wind speed prediction, Support Vector Machine [SVM], forecasting,
hyperplane, kernels, classification
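A hedged sketch of the modelling step, using scikit-learn's SVR as a stand-in
for the paper's SVM model; the synthetic features, the train/test split and
the hyper-parameters are assumptions for illustration, not the station data
or settings used by the authors.

# Illustrative SVM regression for short-term wind speed prediction.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# X: rows of [pressure, humidity, rainfall, temperature, ...]; y: wind speed (m/s).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # stand-in for weather-station readings
y = 5 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_tr, y_tr)
print("MAE on held-out data:", mean_absolute_error(y_te, model.predict(X_te)))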
Title of the Paper: Color Correction for Multi-view Video Based on
Background Segmentation and Dominant Color Extraction
DOWNLOAD FULL PDF
Authors: Feng
Shao, Zongju Peng, You Yang
Abstract: - Color correction is a necessary operation in multi-view video
processing, because color tends to be influenced by camera characteristic,
surface reflectance or scene illumination. To achieve high-quality correction
results, a new color correction method, based on the theoretical model of
multi-view imaging and image restoration, is proposed in this paper.
Considering the illumination problem in multi-view imaging, foreground and
background regions are separated from images and dominant color extraction is
used only for background regions of reference and input images, so that
uniform reference surface information is used. Then, color correction is
extended to video sequences with a tracking approach. Furthermore, an
objective performance evaluation is proposed to evaluate the color correction.
We present a variety of results for different test sequences, showing that
the background-based method outperforms the foreground-based method and that
better subjective and objective visual quality can be achieved for images as
well as videos.
Key-Words:
- Multi-view video, color correction, background segmentation, dominant color
extraction, principal component analysis, color difference.
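To make "dominant color extraction" concrete, the sketch below takes the peak
of a coarse RGB histogram over the segmented background region and derives
per-channel gains toward the reference view. This is only one simple
realization of the idea; the paper's pipeline additionally involves the
imaging model, PCA and temporal tracking.

# Simplified sketch of dominant-color-based correction between two views.
import numpy as np

def dominant_color(image, mask, bins=16):
    """image: HxWx3 uint8 frame; mask: HxW bool array selecting background pixels."""
    px = image[mask].astype(np.int64)
    q = px // (256 // bins)                                # quantize each channel
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    peak = np.bincount(codes, minlength=bins ** 3).argmax()
    return px[codes == peak].mean(axis=0)                  # mean color of the peak bin

def channel_gains(reference, target, ref_mask, tgt_mask):
    """Per-channel gains mapping the target view's background toward the reference view."""
    ref_dom = dominant_color(reference, ref_mask)
    tgt_dom = dominant_color(target, tgt_mask)
    return ref_dom / np.maximum(tgt_dom, 1e-6)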
Title of the Paper: Power-Aware Hybrid Intrusion Detection System (PHIDS)
using Cellular Automata in Wireless Ad Hoc Networks
DOWNLOAD FULL PDF
Authors: P.
Kiran Sree, I. Ramesh Babu, J. V. R. Murty, R. Ramachandran, N. S. S. S. N
Usha Devi
Abstract: Ad hoc wireless networks, with their changing topology and
distributed nature, are more prone to intruders. The network monitoring
functionality should be in operation as long as the network exists, with no
constraints. The efficiency of an intrusion detection system in the case of
an ad hoc network is determined not only by its dynamicity in monitoring but
also by its flexibility in utilizing the available power in each of its
nodes. In this paper we propose a hybrid intrusion detection system, based
on a power level metric for potential ad hoc hosts, which is used to
determine the duration for which a particular node can support a
network-monitoring node. The power-aware hybrid intrusion detection system
focuses on the available power level in each of the nodes and determines the
network monitors. Power awareness in the network results in maintaining
power for network monitoring, with monitors changing often, since it is an
iterative power-optimal solution to identify nodes for distributed
agent-based intrusion detection. The advantage that this approach entails is
the inherent flexibility it provides, by considering only fewer nodes when
re-establishing network monitors. The detection of intrusions in the network
is done with the help of Cellular Automata (CA). The CAs classify a packet
routed through the network either as normal or as an intrusion. The use of
CAs enables the identification of previously encountered intrusions as well
as new intrusions.
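As a toy illustration of CA-based classification (not the trained cellular
automaton of the paper), the sketch below evolves a packet's binary feature
vector under a fixed 1-D, radius-1 rule for a few steps and decides by
majority vote; the rule number, feature encoding and step count are arbitrary
placeholders.

# Toy 1-D cellular automaton used as a binary packet classifier.
import numpy as np

def ca_step(cells, rule=110):
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right             # neighbourhood as a 3-bit number
    table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    return table[idx]

def classify_packet(feature_bits, steps=8):
    cells = np.array(feature_bits, dtype=np.uint8)
    for _ in range(steps):
        cells = ca_step(cells)
    return "intrusion" if cells.mean() > 0.5 else "normal"

# Example: a 16-bit feature vector derived from a packet header (hypothetical encoding).
print(classify_packet([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0]))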
Issue 12, Volume 7, December 2008
Title of the Paper: Power Efficiency Study of Multi-threading Applications
for Multi-core Mobile Systems
DOWNLOAD FULL PDF
Authors: Marius Marcu,
Dacian Tudor, Sebastian Fuicu, Silvia Copil-Crisan, Florin Maticu, Mihai Micea
Abstract: One constant in computing, which also holds for mobile computing,
is the continuing requirement for greater performance. Every performance
advance in mobile processors leads to another level of performance demands
from the newest mobile applications. However, on battery-powered devices
performance is strictly limited by the battery capacity; therefore
energy-efficient applications and systems have to be developed. The power
consumption problem of mobile systems is in general a very complex one and
has remained relevant for quite a long time. In this paper we aim to define
a software execution framework for mobile systems in order to characterize
the power consumption profile of multi-threading mobile applications. Study
results for different thread libraries, multi-core processors and
multithreaded parallelized applications are also presented.
Key-Words: Power consumption, multi-threading, multi-core, mobile
applications, power profiling
Title of the Paper: Modelling a Plasma System for Soliton and Shockwaves
with a Splitting Scheme and a Second and a Third Order High Resolution Scheme
DOWNLOAD FULL PDF
Authors: R. Naidoo
Abstract: A splitting scheme, a fully discrete second order scheme and a
third order semi-discrete scheme are modified and adapted for the numerical
solution of a hyperbolic system of one-dimensional electrostatic plasma
fluid equations. We illustrate how the splitting, the fully discrete (NNT)
and the semi-discrete (SD3) schemes capture the formation and evolution of
ion acoustic solitons and shockwaves. In this study we compare the fully
discrete NNT and the semi-discrete SD3 high resolution schemes with the
splitting scheme, which is constructed for one-dimensional plasma systems
for the first time in the present study. The results indicate that the
splitting scheme demonstrates clear superiority over the NNT and SD3 schemes
in the soliton solution, where the numerical noise of the electron waves is
reduced significantly. For the shock wave solution the NNT and SD3 schemes
are similar to the splitting scheme but exhibit oscillations at the contact
discontinuity. However, the splitting scheme exhibits a smaller
computational time than the NNT and SD3 schemes. It is thus advocated that,
in a one-dimensional plasma system, the splitting scheme be used for soliton
simulations and the NNT/SD3 schemes for shockwave simulations.
Key-Words: Solitons, shockwaves, hyperbolic, plasma, split scheme, high
resolution scheme
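For readers unfamiliar with operator splitting, the fragment below applies
Strang splitting to the model problem u_t + a u_x = -k u (a half step of the
source term, an upwind advection step, another half source step). It is a
stand-in for the plasma fluid system only; the grid, CFL number and
coefficients are illustrative.

# Minimal Strang-splitting illustration on a 1-D advection-decay model problem.
import numpy as np

def strang_step(u, a, k, dx, dt):
    u = u * np.exp(-k * dt / 2)                   # half step: exact source update
    # full step: first-order upwind advection (assumes a > 0, periodic domain)
    u = u - a * dt / dx * (u - np.roll(u, 1))
    return u * np.exp(-k * dt / 2)                # second half step of the source

nx, a, k = 200, 1.0, 0.5
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = 0.8 * dx / a                                 # CFL-limited time step
u = np.exp(-200 * (x - 0.3) ** 2)                 # initial pulse
for _ in range(100):
    u = strang_step(u, a, k, dx, dt)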
Title of the Paper: Quality Inspection of Textile Artificial Textures Using
a Neuro-Symbolic Hybrid System Methodology
DOWNLOAD FULL PDF
Authors: Vianey Guadalupe
Cruz Sanchez, Osslan Osiris Vergara Villegas, Gerardo Reyes Salgado, Humberto
De Jesus Ochoa Dominguez
Abstract: In the industrial sector there are many processes where visual
inspection is essential, and the automation of these processes becomes a
necessity to guarantee the quality of such objects. In this paper we propose
a methodology for textile quality inspection based on the texture cue of an
image. To solve this, we use a Neuro-Symbolic Hybrid System (NSHS) that
allows us to combine an artificial neural network and the symbolic
representation of the expert knowledge. The artificial neural network uses
the CasCor learning algorithm, and we use production rules to represent the
symbolic knowledge. The features used for inspection have the advantage of
being tolerant to rotation and scale changes. We compare the results with
those obtained from an automatic computer vision task, and we conclude that
the results obtained using the proposed methodology are better.
Key-Words: Computer Vision, Neuro-Symbolic Hybrid Systems, Artificial Neural
Networks, Production Rules
Title of the Paper: Identifying Appropriate Methodologies and Strategies
for Vertical Mining with Incomplete Data
DOWNLOAD FULL PDF
Authors: Faris Alqadah,
Zhen Hu, Lawrence J. Mazlack
Abstract: Many data mining methods depend on recognizing frequent patterns.
Frequent patterns lead to the discovery of association rules, strong rules,
sequential episodes, and multi-dimensional patterns. All can play a critical
role in helping corporate and scientific institutions to understand and
analyze their data. Patterns should be discovered in a time- and
space-efficient manner. Discovered patterns have authentic value when they
accurately describe data trends and do not merely reflect noise or chance
encounters. The key advantage of vertical data mining algorithms is that
they can outperform their horizontal counterparts in terms of both time and
space efficiency. Little work has addressed how incomplete data influences
vertical data mining. Consequently, the quality and utility of the results
of vertical mining algorithms remain ambiguous, as real data sets often
contain incomplete data. This paper considers how to establish methodologies
that deal with incomplete data in vertical mining; additionally, it seeks to
develop strategies for determining the maximal utilization that can be mined
from a dataset based on how much and what data is missing.
Key-Words: Incomplete data, vertical, data mining, ignorability, efficiency,
privacy preserving, data sensitivity, maximal utilization, methodologies,
strategies
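A small sketch of the vertical layout the abstract refers to: each item maps
to a tidset (the set of transaction ids containing it) and support is a
tidset intersection. Transactions with missing values are tracked separately
so that a lower and an upper support bound can be reported; this "unknown"
policy is one simple option for illustration, not the methodology developed
in the paper.

# Vertical (tidset) representation with a naive treatment of missing values.
from functools import reduce

transactions = {                     # tid -> {item: 1 present, 0 absent, None missing}
    1: {"A": 1, "B": 1, "C": 0},
    2: {"A": 1, "B": None, "C": 1},
    3: {"A": 0, "B": 1, "C": 1},
    4: {"A": 1, "B": 1, "C": None},
}

def tidsets(db):
    present, unknown = {}, {}
    for tid, row in db.items():
        for item, v in row.items():
            present.setdefault(item, set())
            unknown.setdefault(item, set())
            if v == 1:
                present[item].add(tid)
            elif v is None:
                unknown[item].add(tid)
    return present, unknown

present, unknown = tidsets(transactions)

def support_bounds(itemset):
    """Lower bound counts only known occurrences; the upper bound adds unknowns."""
    low = reduce(set.intersection, (present[i] for i in itemset))
    high = reduce(set.intersection, (present[i] | unknown[i] for i in itemset))
    return len(low), len(high)

print(support_bounds({"A", "B"}))    # (2, 3): tid 2 is unknown for item B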
Title of the Paper: A Comparison of Multi-Agents Competing for Trading
Agents Competition
DOWNLOAD FULL PDF
Authors: Dan Mancas,
Stefan Udristoiu, Ecaterina-Irina Manole, Bogdan Lapadat
Abstract: We present a comparative analysis of several multi-agents
participating in the Trading Agents Competition, Classic. The game is first
partitioned into separate modules, for which distinct strategies may be
developed. The strategies used are considered both individually and in
relation to other agents, as well as to the game environment. Conclusions
regarding possible improvements, better strategies and potential weaknesses
are drawn from the analysis of each agent. Alternatives such as static
market algorithms vs. dynamic market algorithms are considered in detail,
and their advantages and disadvantages are discussed. A discussion of the
TAC market and the possibilities of approximating the stochastic system with
a deterministic one is also provided.
Key-Words: Multi – agent, artificial intelligence, autonomous trading agents,
probabilistic market strategy, machine learning, agents
Title of the Paper: Increasing Level of Correctness in Correlation with
Reliability Level
DOWNLOAD FULL PDF
Authors: Dan Mancas,
Nicolae-Iulian Enescu, Ecaterina-Irina Manole
Abstract: The scope of our research is finding a correlation between the
correctness indicator and the reliability indicator for software programs.
For this, the correctness and reliability indicators will be calculated for
a simple program written in the C programming language. The computations
will be made for each program version obtained by correcting the different
error types found in the testing process. It will be observed that there is
a close correlation between correctness and reliability, in the sense that
an increase in the correctness level is accompanied by a significant
increase in the reliability level.
Key-Words: Testing, correctness, reliability, correlation
Title of the Paper: Evaluating Retina Image Fusion Based on Quantitative
Approaches
DOWNLOAD FULL PDF
Authors: Zhengmao Ye, Hua
Cao, Sitharama Iyengar, Habib Mohamadian
Abstract: Image registration and fusion are conducted using an automated
approach, which applies automatic adaptation of the threshold parameters
from frame to frame. Rather than a qualitative approach, quantitative
measures have been proposed to evaluate the outcomes of retina image fusion.
The concepts of discrete entropy, discrete energy, relative entropy, mutual
information, uncertainty coefficient and information redundancy have been
introduced. Both the Canny edge detector and control point identification
are employed to extract the retinal vasculature using adaptive exploratory
algorithms. Shape similarity criteria have been selected for control point
matching. A Mutual-Pixel-Count maximization based optimization procedure has
also been developed to adjust the control points at the sub-pixel level. The
global maxima equivalent result is then derived by calculating
Mutual-Pixel-Count local maxima. For two cases of image fusion practice, the
testing results are evaluated on the basis of information theory, and
satisfactory outcomes have been obtained.
Key-Words: Image Fusion, Image Registration, Histogram, Discrete Energy,
Discrete Entropy, Relative Entropy, Mutual Information, Uncertainty
Coefficient, Information Redundancy
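Two of the quantitative measures named above are easy to sketch; the fragment
below computes the discrete entropy of a grayscale image and the mutual
information between two images from (joint) histograms. The bin count is an
illustrative choice.

# Histogram-based discrete entropy and mutual information for 8-bit images.
import numpy as np

def entropy(img, bins=256):
    p, _ = np.histogram(img.ravel(), bins=bins, range=(0, 256))
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mutual_information(img_a, img_b, bins=256):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])).sum())

# Example on a synthetic 8-bit image (MI of an image with itself equals its entropy):
# a = np.random.randint(0, 256, (128, 128)); print(entropy(a), mutual_information(a, a))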
Title of the Paper: An Object-Oriented Framework with Corresponding
Graphical User Interface for Developing Ant Colony Optimization Based
Algorithms
DOWNLOAD FULL PDF
Authors: Raka Jovanovic,
Milan Tuba, Dana Simian
Abstract: This paper describes GRAF-ANT (Graphical Framework for Ant Colony
Optimization), an object-oriented C# framework for developing ant colony
systems that we have developed. While developing this framework, abstractions
that are necessary for ant colony optimization algorithms were analyzed, as
well as the features that their implementing classes should have. During
creation of these classes, several problems were solved: implementation of
individual ants and ant colonies, connection between visualization and problem
spaces, creation of a multithread application in which multiple ant colonies
can communicate, creation of a problem independent graphical user interface
(GUI), and establishing an opportunity for hybridization of ACO (Ant Colony
Optimization). The effect of this hybridization on different variations of
ant colony systems is analyzed. The use of GRAF-ANT and its suitability are
illustrated by a few instances of the Traveling Salesman Problem (TSP). We
also present a concept for escaping ACO stagnation in local optima, named
suspicious path destruction, which is also part of GRAF-ANT.
Key-Words: Ant colony system, Evolutionary computing, Combinatorial
Optimization, Swarm Intelligence
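Since GRAF-ANT itself is a C# framework, the following language-neutral
illustration of the core ACO loop is given in Python for brevity: ants build
tours from pheromone and heuristic information, pheromone evaporates, and
good tours deposit more. All parameters are generic defaults, not values from
the paper.

# Minimal ant-system sketch for the Traveling Salesman Problem.
import numpy as np

def aco_tsp(dist, n_ants=20, n_iter=200, alpha=1.0, beta=3.0, rho=0.5, q=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))           # heuristic desirability (avoid /0 on diagonal)
    tau = np.ones((n, n))                    # pheromone matrix
    best_tour, best_len = None, np.inf
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [rng.integers(n)]
            while len(tour) < n:
                i = tour[-1]
                w = (tau[i] ** alpha) * (eta[i] ** beta)
                w[tour] = 0.0                # forbid already-visited cities
                tour.append(int(rng.choice(n, p=w / w.sum())))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                   # evaporation
        for tour, length in tours:           # deposit proportional to tour quality
            for k in range(n):
                tau[tour[k], tour[(k + 1) % n]] += q / length
    return best_tour, best_len

# Example: 10 random cities in the unit square.
pts = np.random.default_rng(1).random((10, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
print(aco_tsp(d))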
Title of the Paper: An Efficient A* Algorithm for the Directed Linear
Arrangement Problem
DOWNLOAD FULL PDF
Authors: Derchian Tsaih,
Guangming Wu, Chiehyao Chang, Shaoshin Hung, Chinshan Wu, Huiling Lin
Abstract: In this paper we present an efficient A* algorithm to solve the
Directed Linear Arrangement Problem. By using a branch and bound technique
to embed a given directed acyclic graph into a layerwise partition search
graph, the optimal directed ordering is then identified through an A*
shortest path search in the embedding graph. We developed a hybrid DC+BDS
algorithm to approximate the optimal linear arrangement solution, which
combines directed clustering and a bidirectional sort technique. Along with
a lower bound based on the maximum flow technique, this approximate solution
is used as an upper bound to prune the state space during the A* search. In
order to reduce the memory requirement of the A* search, we also discuss an
implementation of the relay node technique from Zhou and Hansen [22].
Key-Words: Directed Linear Arrangement, Directed Clustering, A* Search
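A generic A* sketch may help readers place the contribution: the fragment
below implements plain A* over an explicit search graph with an admissible
heuristic. The paper's layerwise partition embedding, DC+BDS upper bound and
relay-node memory reduction are not reproduced here.

# Generic A* shortest-path search over an explicit graph.
import heapq, itertools

def a_star(start, goal, neighbors, h):
    """neighbors(s) -> iterable of (successor, edge_cost); h(s) -> admissible heuristic."""
    tie = itertools.count()
    open_heap = [(h(start), 0.0, next(tie), start, None)]
    parent, closed = {}, set()
    while open_heap:
        f, g, _, s, p = heapq.heappop(open_heap)
        if s in closed:
            continue
        closed.add(s)
        parent[s] = p
        if s == goal:                        # reconstruct the path back to start
            path = [s]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return list(reversed(path)), g
        for t, c in neighbors(s):
            if t not in closed:
                heapq.heappush(open_heap, (g + c + h(t), g + c, next(tie), t, s))
    return None, float("inf")

# Toy usage with a zero heuristic (reduces to Dijkstra) on a small weighted digraph.
graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 1), ("d", 5)], "c": [("d", 1)], "d": []}
print(a_star("a", "d", lambda s: graph[s], lambda s: 0))   # (['a', 'b', 'c', 'd'], 3)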
Title of the Paper: Software Quality and Assurance in Waterfall model and
XP - A Comparative Study
DOWNLOAD FULL PDF
Authors: Sana'a Jawdat
Khalaf, Mohamed Noor Al-Jedaiah
Abstract: Dealing with an increasingly volatile organizational environment
is a serious challenge for managers of any software development. Traditional
formal software development methodologies can be characterized as reflecting
linear, sequential processes and the related management approaches, and can
be effective in developing software with stable, known, consistent
requirements. Yet most real-world development efforts are much more likely
to be conducted in more volatile environments, as organizations adapt to
changing technology, markets, and social conditions. Requirements for
systems must be able to change right along with them, often at “Internet
speed” [1]. Project management approaches based on the traditional linear
development methodologies are mismatched with such dynamic systems. The
support of software quality in a software development process may be
regarded under two aspects: first, by providing techniques which support the
development of high quality software and, second, by providing techniques
which assure the required quality attributes in existing artifacts. Both
approaches have to be combined to achieve effective and successful software
engineering [2]. Agile methods may produce software faster, but we also need
to know how they meet our quality requirements. In this paper we compare the
waterfall model with agile processes to show how agile methods achieve
software quality under time pressure and in an unstable requirements
environment, i.e. we analyze agile software quality assurance. We present a
detailed waterfall model showing its software quality support processes. We
then show the quality practices that agile methods have integrated into
their processes. This allows us to answer the question “Can agile methods
ensure quality even though they develop software faster and can handle
unstable requirements?” [3].
Key-Words: Agile processes, Extreme Programming, Waterfall model, Software
development, Software quality, Customer Satisfactions, Customer needs
Title of the Paper: A New Technique for Detecting Dental Diseases by using
High Speed Artificial Neural Networks
DOWNLOAD FULL PDF
Authors: Hazem M.
El-Bakry, Nikos Mastorakis
Abstract: In this paper, a new fast algorithm for dental diseases detection is
presented. Such algorithm relies on performing cross correlation in the
frequency domain between input image and the input weights of fast neural
networks (FNNs). It is proved mathematically and practically that the number
of computation steps required for the presented FNNs is less than that needed
by conventional neural networks (CNNs). Simulation results using MATLAB
confirm the theoretical computations. One of the limitations of Direct Digital
Radiography (DDR) is noise. Some recent publications have indicated that
Digital Subtraction Radiography (DSR) might significantly aid in the clinical
diagnosis of dental diseases, once various clinical logistic problems limiting
its widespread use have been overcome. Noise in digital radiography may
result from sources other than variation in projection geometry during
exposure. Structure noise consists of all anatomic features other than those
of diagnostic interest. Limitations of plain radiographs in detecting early,
small bone lesions are also due to the presence of structure noise. This
research work has been undertaken in an attempt to minimize structure noise
in digital dental radiography by using digital subtraction radiography. By
minimizing the structure noise, the validity of the digitized image in
detecting diseases is enhanced.
Key-Words: Direct digital radiography, structure noise, dental bone lesions,
digital subtraction radiography, fast neural networks, cross correlation
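The speed-up claimed above rests on computing cross correlation in the
frequency domain; the numpy sketch below correlates an image with a weight
kernel via the FFT and reports the strongest response location. Sizes are
illustrative and this is not the full FNN detector.

# Frequency-domain (FFT-based) cross correlation of an image with a kernel.
import numpy as np

def cross_correlate_fft(image, kernel):
    """Circular cross-correlation of image with kernel (kernel zero-padded to image size)."""
    padded = np.zeros_like(image, dtype=np.float64)
    padded[:kernel.shape[0], :kernel.shape[1]] = kernel
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(padded))))

image = np.random.rand(256, 256)
kernel = np.random.rand(20, 20)
response = cross_correlate_fft(image, kernel)
peak = np.unravel_index(np.argmax(response), response.shape)
print("strongest response at", peak)   # candidate location of the learned pattern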
Title of the Paper: Fast Evolution of Large Digital Circuits
DOWNLOAD FULL PDF
Authors: Xiaoxuan She
Abstract: Evolvable hardware (EHW) refers to self-reconfigurable hardware
design, where the configuration is under the control of an evolutionary
algorithm. One of the main difficulties in using EHW to solve real-world
problems is scalability, which limits the size of the circuit that may be
evolved. This paper outlines evolvable hardware based on a 2-LUT (2-input
lookup table) array, which allows the evolution of large circuits via
decomposition. The proposed EHW has been tested with multipliers and logic
circuits taken from the Microelectronics Centre of North Carolina (MCNC)
benchmark library. The experimental results demonstrate that the proposed
scheme improves the evolution of logic circuits in terms of the number of
generations, area and delay, reduces computational time and enables the
evolution of large circuits. The proposed EHW automatically generates a
complete circuit netlist in an SDRAM. Because of the low cost and large data
storage of an SDRAM, the evolvable hardware provides a good platform to
evolve large circuits.
Key-Words: Evolvable hardware, Evolutionary computation, Adaptive systems,
Digital circuits, Problem decomposition, Circuit design
Title of the Paper:
Enhancement of Data Aggregation Using A Novel Point Access Method
DOWNLOAD FULL PDF
Authors: Hung-Yi Lin,
Rong-Chang Chen, Shih-Ying Chen
Abstract: The B+-tree and its variants have been reported as good index
structures for retrieving data. Database systems frequently establish
B+-tree style indices for fast access to data records. However, a
traditional B+-tree index can become a performance bottleneck because of its
expanding hierarchy. Many works focus on improving indexing techniques. In
fact, the optimization of the data organization inside index nodes is the
most critical factor for improving retrieval quality. Approaches such as
pre-partitioning of the data space, forced node splitting, node splitting
with unbalanced partitions, and node splitting upon overflow always burden
index structures with considerable storage space and building overhead. In
this paper, we propose a new index scheme that highly aggregates the
external structure of a B+-tree. It also adopts a better splitting policy
that completely removes the sensitivity to data insertion order. Our new
index technique can compress data records in the leaves and in turn reduce
the index size to improve query performance. In addition, the space
utilization of the entire index is raised to a higher level, so that the
index’s space requirement becomes smaller and the index easily resides in
memory.
Key-Words: B+-tree, Index structure, B*-tree, Databases, Data
aggregation, Node splitting
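To illustrate why the splitting policy matters, the toy sketch below models
only the leaf level of a B+-tree-like index: sorted keys packed into
fixed-capacity chained leaves that split in half on overflow, with a helper
reporting space utilization. The paper's aggregation scheme and its splitting
policy are not reproduced.

# Toy leaf chain with overflow splitting and a space-utilization metric.
import bisect

LEAF_CAPACITY = 4   # illustrative fan-out

class Leaf:
    def __init__(self):
        self.keys = []
        self.next = None   # leaves are chained for range scans

def insert(head, key):
    leaf = head
    while leaf.next and leaf.next.keys and key >= leaf.next.keys[0]:
        leaf = leaf.next                      # walk the chain (a real tree descends instead)
    bisect.insort(leaf.keys, key)
    if len(leaf.keys) > LEAF_CAPACITY:        # overflow: split the leaf in half
        right = Leaf()
        mid = len(leaf.keys) // 2
        right.keys, leaf.keys = leaf.keys[mid:], leaf.keys[:mid]
        right.next, leaf.next = leaf.next, right

def utilization(head):
    leaves, used = 0, 0
    while head:
        leaves, used, head = leaves + 1, used + len(head.keys), head.next
    return used / (leaves * LEAF_CAPACITY)

head = Leaf()
for k in [15, 3, 42, 8, 23, 4, 16, 35, 9, 11]:
    insert(head, k)
print("space utilization:", utilization(head))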
Title of the Paper: A New Approach of Secret Key Management Lifecycle for
Military Applications
DOWNLOAD FULL PDF
Authors: Nikolaos Bardis,
Nikolaos Doukas, Konstantinos Ntaikos
Abstract: In this paper a new approach is presented for key management and
for access to and sharing of secret keys between certified users of a group.
Such schemes are referred to as Symmetric Key Management Systems. The
concept of information lifecycle management is first presented and analysed
in the context of data storage efficiency. This concept is then extended for
use with the management of symmetric secret keys. The need for a standard in
symmetric secret key management is presented and founded on software
engineering principles. A novel scheme contributing in this direction is
hence presented. Specifically, access control processes based on passwords
are presented. These passwords, with the additional use of the AES
cryptographic algorithm and nonces, can be used to provide not only
authentication for access control in the system but also access to the
encrypted file that stores all the symmetric secret keys of each user of the
certified group. Following this, a new approach for the lifecycle management
of secret keys is presented in order to achieve secure communication based
on encryption and decryption of all messages in real time, with the
simultaneous use of two symmetric secret keys for each transmission of
information between the users. It is finally concluded that this innovative
technology guarantees automatic password and secret key lifecycle management
irrespective of the actions of the users and provides secure communication
between certified groups of users in local networks and on the internet.
Key-Words: Key management lifecycle, key management system, access control,
symmetrical secret key
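A simplified sketch of the building blocks mentioned in the abstract: a
password is stretched into an AES key with PBKDF2 (standard library) and the
file holding a user's symmetric secret keys is sealed with AES-GCM under a
fresh nonce (third-party cryptography package). This is an illustration only,
not the paper's lifecycle scheme, and every name in it is hypothetical.

# Password-derived AES-GCM sealing of a per-user symmetric key store.
import os, json, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_key(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def seal_key_store(password: str, key_store: dict) -> bytes:
    salt, nonce = os.urandom(16), os.urandom(12)
    aes = AESGCM(derive_key(password, salt))
    blob = aes.encrypt(nonce, json.dumps(key_store).encode(), None)
    return salt + nonce + blob                    # store salt and nonce with the ciphertext

def open_key_store(password: str, sealed: bytes) -> dict:
    salt, nonce, blob = sealed[:16], sealed[16:28], sealed[28:]
    aes = AESGCM(derive_key(password, salt))
    return json.loads(aes.decrypt(nonce, blob, None))

store = {"peer-alice": os.urandom(32).hex(), "peer-bob": os.urandom(32).hex()}
sealed = seal_key_store("correct horse battery staple", store)
assert open_key_store("correct horse battery staple", sealed) == store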
Title of the Paper: Routing Optimization Heuristics Algorithms for Urban
Solid Waste Transportation Management
DOWNLOAD FULL PDF
Authors: Nikolaos V.
Karadimas, Nikolaos Doukas, Maria Kolokathi, Gerasimoula Defteraiou
Abstract: During the last decade, metaheuristics have become increasingly
popular for effectively confronting difficult combinatorial optimization
problems. In the present paper, two individual metaheuristic algorithmic
solutions, the ArcGIS Network Analyst and the Ant Colony System (ACS)
algorithm, are introduced, implemented and discussed for the identification of
optimal routes in the case of Municipal Solid Waste (MSW) collection. Both
proposed applications are based on a geo-referenced spatial database supported
by a Geographic Information System (GIS). GIS are increasingly becoming a
central element for coordinating, planning and managing transportation
systems, and so in collaboration with combinatorial optimization techniques
they can be used to improve aspects of transit planning in urban regions.
Here, the GIS takes into account all the required parameters for MSW
collection (i.e. positions of waste bins, the road network and the related
traffic, truck capacities, etc.) and its desktop users are able to model
realistic network conditions and scenarios. In this case, the simulation
consists of scenarios of visiting varied waste collection spots in the
Municipality of Athens (MoA). The user, in both applications, is able to
define or modify all the required dynamic factors for the creation of an
initial scenario, and by modifying these particular parameters, alternative
scenarios can be generated. Finally, the optimal solution is estimated by each
routing optimization algorithm, followed by a comparison between these two
algorithmic approaches on the newly designed collection routes. Furthermore,
the proposed interactive design of both approaches has potential application
in many other environmental planning and management problems.
Key-Words: Ant Colony System, ArcGIS Network Analyst, Waste Collection,
Optimization Algorithms, Routing, Simulation
Title of the Paper: Prediction of Domestic Warm-Water Consumption
DOWNLOAD FULL PDF
Authors: Elena Serban,
Daniela Popescu
Abstract: The paper presents methodologies able to predict dynamic warm water
consumption in district heating systems, using time-series analysis. A
simulation model according to the day of the week has been chosen for modeling
the domestic warm water consumption in a block of flats with 60 apartments.
The analysis of the residuals indicates good simulation and prediction models
for the cases studied. Double-cross validation was done using data collected
by the SCADA system from District Heating Company of Iasi.
Key-Words: District heating systems, domestic warm water, time series models,
autoregressive model, simulation, prediction
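As a minimal example of the autoregressive modelling mentioned in the
keywords, the sketch below fits an AR model by ordinary least squares and
produces a one-step-ahead forecast; the model order and the synthetic hourly
series are assumptions, whereas the paper fits day-of-week-dependent models
to SCADA measurements.

# Least-squares AR model fitting and one-step-ahead prediction.
import numpy as np

def fit_ar(series, order):
    """Return (coefficients, intercept) of x_t = c + sum_i a_i * x_{t-i}."""
    X = np.column_stack([series[order - i - 1:len(series) - i - 1] for i in range(order)])
    X = np.column_stack([X, np.ones(len(X))])
    target = series[order:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef[:-1], coef[-1]

def predict_next(series, coef, intercept):
    # Most recent values first, matching the lag ordering used in fit_ar.
    return float(np.dot(coef, series[-1:-len(coef) - 1:-1]) + intercept)

rng = np.random.default_rng(0)
t = np.arange(24 * 14)                                   # two weeks of hourly readings
series = 10 + 3 * np.sin(2 * np.pi * t / 24) + rng.normal(scale=0.3, size=t.size)
coef, c = fit_ar(series, order=24)
print("next-hour forecast:", predict_next(series, coef, c))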
Title of the Paper: An Algorithm for Minimum Flows
DOWNLOAD FULL PDF
Authors: Oana Georgescu,
Eleonor Ciurea
Abstract: In this paper we consider an advanced topic for the minimum flow
problem: the use of the dynamic trees data structure to efficiently
implement the decreasing path algorithm. In the final part of the paper we
present an example for this algorithm, some practical applications of the
minimum flow problem, conclusions and open problems.
Key-Words: Network flow, network algorithms, minimum flow problem, dynamic
tree
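For concreteness, the sketch below implements the decreasing-path phase of
the minimum flow algorithm with plain breadth-first search instead of dynamic
trees, assuming integer bounds and a feasible flow that already respects the
lower bounds; each round finds a residual path from t to s and decreases the
flow value by the bottleneck amount.

# Decreasing-path phase of the minimum flow algorithm (BFS version, no dynamic trees).
from collections import deque

def min_flow(n, edges, flow, s, t):
    """edges: list of (u, v, lower, upper); flow: feasible initial flow per edge (list)."""
    def residual():
        adj = [[] for _ in range(n)]
        for idx, (u, v, lo, up) in enumerate(edges):
            if flow[idx] > lo:          # flow on (u,v) can be decreased -> residual arc (v,u)
                adj[v].append((u, idx, flow[idx] - lo, -1))
            if flow[idx] < up:          # flow on (u,v) can be increased -> residual arc (u,v)
                adj[u].append((v, idx, up - flow[idx], +1))
        return adj

    while True:
        adj, parent = residual(), {t: None}
        queue = deque([t])
        while queue and s not in parent:    # BFS for a decreasing path from t to s
            u = queue.popleft()
            for v, idx, cap, sign in adj[u]:
                if v not in parent:
                    parent[v] = (u, idx, cap, sign)
                    queue.append(v)
        if s not in parent:
            return flow                     # no decreasing path left: flow value is minimum
        path, v = [], s                     # walk back from s to t collecting path arcs
        while parent[v] is not None:
            path.append(parent[v])
            v = parent[v][0]
        delta = min(cap for _, _, cap, _ in path)
        for _, idx, _, sign in path:        # augment: decreases the s-to-t flow value by delta
            flow[idx] += sign * delta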