Issue 1, Volume 10, January 2011
Title of the Paper: Frequencies
of Propagation of Electromagnetic Waves in a Hexagonal Waveguide
Authors:
Arti Vaish, Harish Parthasarathy
Abstract: In this work, the cut-off frequencies of electromagnetic wave propagation in a hexagonal waveguide are calculated using the two-dimensional (2-D) finite element method. The numerical approach is a standard one and involves six finite elements. A new type of hexagonal waveguide structure for the simple homogeneous dielectric case is considered. The starting point is Maxwell's equations in conjunction with the exponential dependence of the fields on the z-coordinate, which for the homogeneous case yields the Helmholtz equations. Finally, the finite element method is used to derive approximate values of the possible propagation constant for each frequency.
Keywords: Finite element method, Variational principle, Eigenvector, Matrix equation, Frequencies of propagation, Hexagonal waveguide
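To make the final step of the abstract concrete, the following minimal sketch (an illustration under stated assumptions, not the authors' code) computes cut-off frequencies from pre-assembled 2-D FEM stiffness and mass matrices by solving the generalized eigenproblem K x = k_c^2 M x:

```python
import numpy as np
from scipy.linalg import eigh

def cutoff_frequencies(K, M, n_modes=4, c=3.0e8):
    """Lowest n_modes cut-off frequencies (Hz) from FEM matrices K, M.

    K, M : symmetric (n, n) stiffness and mass matrices assembled over
           the waveguide cross-section; c is the wave speed in the
           homogeneous dielectric filling.
    """
    # Generalized eigenproblem K x = lambda M x with lambda = k_c^2.
    eigvals = eigh(K, M, eigvals_only=True)
    # A near-zero mode may appear for Neumann-type boundary conditions
    # and can be discarded if present.
    k_c = np.sqrt(np.abs(eigvals[:n_modes]))
    return c * k_c / (2.0 * np.pi)  # f_c = c * k_c / (2 pi)
```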
Title of the Paper: Secure
and Highly Efficient Three Level Key Management Scheme for MANET
Authors:
Wan An Xiong, Yao Huan Gong
Abstract: A MANET (Mobile Ad hoc Network) is a convenient infrastructure-less communication network which is commonly susceptible to various attacks. Many key management schemes for MANETs have been presented to solve various security problems. Identity (ID)-based cryptography with threshold secret sharing, ECC and bilinear pairing computation is a popular approach to key management design. In this article, we adopt these approaches to construct a tree-structured and cluster-structured ad hoc network with a three-level secure communication framework. After constructing the security structure, we evaluate the security performance and efficiency of the scheme in detail.
Keywords: Three Level Key Management, Elliptic Curve Cryptography, Bilinear
Pairing Computation, (n,t) Threshold Key Distribution, ID-based key management
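One building block named in the keywords, (n,t) threshold key distribution, can be illustrated with a Shamir-style secret sharing sketch; the field prime and helper names are illustrative choices, not taken from the paper:

```python
import random

P = 2**127 - 1  # a Mersenne prime, used as the field modulus (illustrative)

def split(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret
```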
Title of the Paper: Selection
of Polynomials for Cyclic Redundancy Check for the use of High Speed Embedded
– An Algorithmic Procedure
Authors:
A. Ahmad, L. Hayat
Abstract: The Cyclic Redundancy Check (CRC), a technique widely used in globally standardized telecommunications systems for data error detection and correction, has not itself been fully standardized. Most of the CRCs in current use have some weakness with respect to strength or construction. Standardization, which would allow better-designed CRCs to come into common use, is limited primarily by the complexity of the search procedures for primitive characteristic polynomials. To this end, this paper proposes a method of simplifying the computation and complexity of the search procedure for primitive characteristic polynomials, in order to facilitate implementation of the circuitry for high-speed CRC computation in standard CMOS technology.
Keywords: Cyclic Redundancy Check, CRC, Linear Feedback Shift Registers, LFSRs, Primitive Polynomial, Primitive Characteristic Polynomial, Power Dissipation, Exclusive-OR, D-Flip-Flop
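For readers unfamiliar with the LFSR view of CRCs discussed above, here is a hedged software model of shift-register CRC computation; the generator polynomial (CRC-16/IBM, x^16+x^15+x^2+1) is an illustrative choice, not one produced by the paper's search procedure:

```python
def crc_bitwise(data: bytes, poly: int = 0x8005, width: int = 16) -> int:
    """Shift-register CRC, one bit per iteration (mirrors an LFSR)."""
    topbit = 1 << (width - 1)
    mask = (1 << width) - 1
    reg = 0
    for byte in data:
        reg ^= byte << (width - 8)          # feed the next byte in
        for _ in range(8):
            # Shift left; XOR in the polynomial when the top bit pops out.
            reg = ((reg << 1) ^ poly) & mask if reg & topbit else (reg << 1) & mask
    return reg
```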
Title of the Paper: Performance
Evaluation of Distributed Database on PC Cluster Computers
Authors:
Sorapak Pukdesree, Vitalwonhyo Lacharoj, Parinya Sirisang
Abstract: Information is now a vital asset in virtually every application. Modern organizations store and manage their information using database management systems. Proprietary DBMS licenses are expensive, with costs that depend on the transaction-handling capacity required. This research therefore investigates a distributed database methodology that can be scaled to improve the performance of the database system to meet business requirements. To implement the distributed database methodology, the researchers use an open-source DBMS, MySQL Cluster, as the research tool. MySQL Cluster is built on distributed database technology whose performance can be scaled dynamically on PC cluster computers. Based on PC clustering, MySQL Cluster can provide higher performance at significantly lower cost than enterprise DBMSs. This research focuses on small and medium enterprise (SME) businesses in Thailand whose annual revenues are below one and a half million dollars. Most of their budgets are spent on production rather than invested in information technology. SMEs in Thailand can therefore use the findings of this research to plan database management systems that meet their business requirements.
Keywords: High Performance Computing, PC Clustering Computers, Database, MySQL
Cluster, Distributed Database, Distributed Processing
Issue 2, Volume 10, February 2011
Title of the Paper: Predict Strength of Rubberized Concrete Using Artificial Neural Network
Authors:
A. Abdollahzadeh, R. Masoudnia, S. Aghababaei
Abstract: In this paper, the behaviour of rubberized concrete was modelled using an artificial neural network (ANN), and the results obtained were compared to experimental data. The experimental tests involved replacing 5, 10, 15 and 20% of the concrete aggregate with rubber powder of sizes 0.2, 0.4, 0.6 and 0.8 mm. The results demonstrate the high ability of the ANN to predict the compressive strength of rubberized concrete compared to multiple linear regression (MLR) (R^2 = 0.9650 and RMSE = 0.017). Finally, the performance of each model was evaluated using the Root Mean Square Error (RMSE), Correlation Coefficient (R), Coefficient of Determination (R^2), and Mean Absolute Relative Error (MARE).
Keywords: Rubberized concrete, Artificial neural network, Multi linear
regression, Root Mean Square
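The evaluation metrics named in the abstract above (RMSE, R, R^2, MARE) can be computed as in the following NumPy sketch; variable names are illustrative:

```python
import numpy as np

def evaluate(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_pred - y_true
    rmse = np.sqrt(np.mean(err ** 2))            # Root Mean Square Error
    r = np.corrcoef(y_true, y_pred)[0, 1]        # Correlation coefficient
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                   # Coefficient of determination
    mare = np.mean(np.abs(err / y_true))         # Mean Absolute Relative Error
    return {"RMSE": rmse, "R": r, "R2": r2, "MARE": mare}
```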
Title of the Paper: New
Standards for Competitive Distinctions: A Practical Model
Authors:
Edson Pacheco Paladini, Fabricia Goncalves De Carvalho
Abstract: This paper discusses how to create knowledge modeling processes for strategic management. Innovation is the main strategy for the management
approach. Basic concepts of knowledge management were used to support a
proposed model that involves people in strategic decisions. The search for
solutions to problems in the field of innovation management is justified by
the stiff competition companies now face. In this situation, corporations need
to transform their culture by giving incentives to the search for creative and
innovative solutions generated by their human resources. This position is
essential for organizations that attempt to create new standards of action and
establish competitive distinctions.
Keywords: Strategic decisions; knowledge modeling; innovation management
Title of the Paper: Predict
Soil Erosion with Artificial Neural Network in Tanakami (Japan)
Authors:
A. Abdollahzadeh, M. Mukhlisin, A. El Shafie
Abstract: In recent years, the use of artificial neural networks (ANNs) has grown, as they are a powerful tool capable of modeling linear and nonlinear relationships in complex engineering problems. They have been applied significantly across different civil engineering fields, especially to hydrological problems involving important parameters with diverse variables and complex mathematical equations. This study addresses the prediction of soil erosion, one of the important parameters in catchment management. To obtain data, artificial rainfall was used in a catchment located at Jakujo Rachidani in the Tanakami area. An ANN was developed to predict soil erosion, and its results were compared with those obtained from multiple linear regression (MLR). The results show the high ability of the ANN to predict soil erosion compared to MLR. The performance of each model is evaluated using the Mean Square Error (MSE), Root Mean Square Error (RMSE), Correlation Coefficient (R), Coefficient of Determination (R^2), and Mean Absolute Relative Error (MARE).
Keywords: Soil Erosion, Catchment, Modeling, Artificial neural network, Multi
linear regression, Mean Square Error
Title of the Paper: Orthogonal
Software Architecture Design for Radar Data Processing System with
Object-oriented Component and COM Interface
Authors:
Zhongzhi Li, Xuegang Wang, Xuelian Yu
Abstract: Large-scale software systems are usually developed with software engineering methods and need a good architecture and reusable components. A radar data processing system is a complex software system; it must complete many tasks such as multi-sensor data fusion, target tracking, data storing and displaying, remote controlling, etc. Based on orthogonal software architecture and component-based software engineering, we propose a new method, orthogonal software architecture with object-oriented components and a COM interface, and we use it to complete the architecture and component design for a radar data processing software system. By eliminating correlation between components, we improve the reusability and maintainability of the components. At the same time, we use the COM interface to implement mixed-language programming and system integration. System development and testing show that the new software architecture is reasonable and applicable.
Keywords: Orthogonal software architecture; Component-based software
engineering; Object-oriented component; Component object model (COM); Module;
Radar data processing system
Issue 3, Volume 10, March 2011
Title of the Paper: A
Comparative Analysis of Methods for Probability Estimation Tree
Authors:
Na Chu, Lizhuang Ma, Ping Liu, Yiyang Hu, Min Zhou
Abstract: In this paper, we address the problem of probability estimation in decision trees. This problem has received considerable attention in the areas of machine learning and data mining, and techniques to use tree models as probability estimators have been suggested. We make a comparative study of six well-known class probability estimation methods, measured by classification accuracy, AUC and Conditional Log Likelihood (CLL). Comments on the properties of each method are empirically supported. Our experiments on UCI data sets and our liver disease data sets show that the PET algorithms significantly outperform traditional decision trees and naive Bayes in classification accuracy, AUC and CLL. Finally, a unifying pseudocode of the algorithms is summarized in this paper.
Keywords: Probability estimation tree, Decision trees, Classification, Joint
distribution, AUC, Conditional log likelihood
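As an illustration of the general idea of class probability estimation at tree leaves, the sketch below applies the widely used Laplace correction; it is not necessarily one of the six methods compared in the paper:

```python
def laplace_leaf_probability(class_counts):
    """P(c | leaf) = (n_c + 1) / (N + K) for K classes at one leaf."""
    total = sum(class_counts.values())
    k = len(class_counts)
    return {c: (n + 1) / (total + k) for c, n in class_counts.items()}

# Example: a leaf holding 8 positive and 2 negative training records.
print(laplace_leaf_probability({"pos": 8, "neg": 2}))  # pos = 0.75, neg = 0.25
```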
Title of the Paper: The
Influence of Antecedent Factors of IS/IT Utilization Towards Organizational
Performance - A Case Study of IAIN Raden Fatah Palembang
Authors:
Rika Kharlina Ekawati, Achmad Nizar Hidayanto
Abstract: Information technology plays an important role in supporting the operational success of an organization. In an uncertain environment, information is needed primarily to support the performance of the organization in decision-making. An information system is an orderly combination of human, hardware, software and communication-network data resources, which collect, modify, and distribute information within an organization to support organizational decision-making and control. Before IS/IT is implemented, however, it is worth examining the antecedent factors of IS/IT implementation, and whether they correlate with and influence organizational performance. The antecedent factors consist of six aspects: social factors, attitudes, facilitating conditions, system complexity, long-term consequences and habits. This study aims to find empirical evidence that there is correlation and influence between the antecedent factors of IS/IT implementation and organizational performance. Using Pearson correlation analysis and regression analysis for the testing, the results show that attitude, facilitating conditions and system complexity correlate with organizational performance. Among these, only attitude and facilitating conditions influence organizational performance.
Keywords: Information System, Information Technology, Antecedent Factors,
Organizational Performance, Utilization Model.
Title of the Paper: A Study on Factors Influencing Power Consumption in Multithreaded and Multicore CPUs
Authors:
Vijayalakshmi Saravanan, Senthil Kumar Chandran, Sasikumar Punnekkat, D. P.
Kothari
Abstract: The ever-growing demand for computational power and high performance
has led to a rapid growth in the semiconductor industry. This evolution has
seen a continuous increase in CPU performance, and the number of transistors on a chip has roughly doubled every two years, in keeping with Moore's law. An
inevitable consequence when achieving this is that more functional units,
deeper pipelining and larger cache sizes have had to be implemented on the CPU
chip. The result is a significant increase in the power consumption. Achieving
high performance with low power consumption has been the traditional goal in
high-end processors. In order to accomplish high performance, multithreaded
and multicore CPUs have become the recent trend in semi-conductor technology.
The purpose of this paper is to statistically analyse various factors that
affect power, to study their relationship, and to quantify their influence on
power consumption in multithreaded and multicore CPUs. The paper also
discusses recent advancements in power savings through the implementation of
power-limiting micro-architectural features (e.g. out-of-order execution,
branch prediction, caching and prefetching) in contemporary multi-core
processors, such as Intel Nehalem and AMD’s Istanbul processors.
Keywords: Power consumption, statistical analysis, power-limiting factors
Issue 4, Volume 10, April 2011
Title of the Paper: A New
Immune Clone Algorithm to Solve the Constrained Optimization Problems
Authors:
Liang Zhou, Jianguo Zheng
Abstract: In recent years, constrained optimization problems have become a topic of considerable scholarly interest. In this paper, a new improved artificial immune algorithm is proposed and used for solving constrained optimization problems (COPs). The algorithm treats COPs as multi-objective optimization problems and solves them using the concept of Pareto optimality. A cloning mechanism is incorporated into the new immune algorithm; at the same time, the algorithm introduces new concepts, such as a linear non-equilibrium recombination operator and preference difference, which build an efficient immune model for solving this kind of multi-objective problem. Finally, simulations on test functions show that the new immune clone algorithm obtains better results than existing algorithms.
Keywords: Constrained optimization, Multi-objective optimization, Linear non-equilibrium recombination operator, Immune, Clone, Preference difference, Pareto optimization
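The Pareto view of constrained optimization described above can be sketched as follows, treating the objective and the total constraint violation as two minimization criteria; names and details are illustrative, not the paper's algorithm:

```python
def violation(x, constraints):
    """Sum of positive parts of g_i(x) <= 0 constraint functions."""
    return sum(max(0.0, g(x)) for g in constraints)

def dominates(a, b, f, constraints):
    """True if solution a Pareto-dominates b on (objective, violation)."""
    fa, fb = f(a), f(b)
    va, vb = violation(a, constraints), violation(b, constraints)
    return (fa <= fb and va <= vb) and (fa < fb or va < vb)
```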
Title of the Paper: Performance
Evaluation of Artificial Neural Networks for Spatial Data Analysis
Authors:
Akram A. Moustafa, Ziad A. Alqadi, Eyad A. Shahroury
Abstract: An artificial neural network training algorithm is implemented in MATLAB. The implementation focuses on the network parameters in order to find the optimal architecture, i.e. the network that can reach its goals in the minimum number of training iterations and the minimum training time. Many examples were tested, and it was shown that using one hidden layer with a number of neurons equal to the square of the number of inputs leads to an optimal neural network by reducing the number of training stages (training iterations) and thus the processing time needed to train the network.
Keywords: Artificial neural network (ANN), Back-propagation, training rate and
training iteration (epochs), hidden layer, net simulation, multilayer
perceptron (MLP)
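The sizing rule reported in the abstract above, one hidden layer with (number of inputs)^2 neurons, might look like the following sketch, shown with scikit-learn's MLPRegressor as an illustrative stand-in for the authors' MATLAB setup:

```python
from sklearn.neural_network import MLPRegressor

def build_mlp(n_inputs):
    # One hidden layer sized to the square of the input count.
    return MLPRegressor(hidden_layer_sizes=(n_inputs ** 2,), max_iter=2000)

# e.g. a 4-input problem gets a single 16-neuron hidden layer
model = build_mlp(4)
```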
Title of the Paper: Improving
Arabic Information Retrieval System using N-Gram Method
Authors:
Rammal Mahmoud, Sanan Majed
Abstract: This paper presents the application of N-gram-based indexing and retrieval to the Arabic legal language used in official Lebanese government journal documents. In our work we used N-grams as a representation method, based on words and characters, and then compared the results using the vector space model with three similarity measures: TF*IDF weighting, Dice's coefficient and the cosine coefficient. The experiments demonstrate that using trigrams to index Arabic documents is the optimal choice for Arabic information retrieval with N-grams. However, N-gram indexing and retrieval of legal Arabic documents is still insufficient to obtain good results, and it is indispensable to adopt a linguistic approach that uses a legal thesaurus or ontology for juridical language.
Keywords: Arabic language, Indexing, N-grams, Information Retrieval, Word
segmentation
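The character-trigram representation and cosine similarity mentioned above can be sketched as follows; the padding and raw-count weighting are illustrative simplifications:

```python
from collections import Counter
from math import sqrt

def trigrams(text):
    t = f"  {text} "                    # pad so word edges form trigrams
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a, b):
    dot = sum(a[g] * b[g] for g in a if g in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Example: near-duplicate strings score high.
print(cosine(trigrams("information retrieval"), trigrams("informations retrieval")))
```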
Issue 5, Volume 10, May 2011
Title of the Paper: Designing
Test Engine for Computer-Aided Software Testing Tools
Authors:
Xue-Ying Ma, Bin-Kui Sheng
Abstract: With the rapid development of software scale and programming
languages, it is impossible to test software manually. The case for automating
the software testing process has been made repeatedly and convincingly by
numerous testing professionals. Automated tests can promote the efficiency of
software testing and thereby increase software productivity, improve software
quality, and reduce cost in almost all processes of software engineering.
White-box testing is one of the most important software testing strategies
that can detect error even when the software specification is vague or
incomplete. This paper gives a detailed description of the design and
implementation of a testing engine. The testing engine, which is the kernel of
a developed structured software-testing tool for the Visual Basic and C/C++
languages, mainly consists of three components: a program analyzer, a source code instrumentation tool and an intermediate database. In the testing engine, a block
division mechanism and a new block-based CFG model are introduced and some
block-based test adequacy criteria are extended. The programs are divided into
a sequence of blocks and then instrumented and compiled in the testing engine,
and all the information related to the test is saved in the intermediate
database. The testing engine, acting as an agency, associates the testing
automation module with instrumented executable program rather than the source
code, and therefore the testing tool can easily be developed to accommodate
new requirements and different testing adequacy criteria. It is also
convenient to build a testing environment for multiple languages by modifying the
program analyzer only, due to the flexibility of the software architecture.
Keywords: Computer-aided software test, testing engine, program
instrumentation, Intermediate database, object-oriented software-testing
Title of the Paper: The
Computer Aided Analysis of the Bus Accidents Oriented to the Numerical
Simulation of the Injury of the Human Body
Authors:
Xiao-Yun Zhang, Xian-Long Jin, Jie Shen
Abstract: While bus accidents tend to draw public concern in China, much recent research has focused only on the analysis of car accidents, due to their relatively high rates. The present research, however, is dedicated to reconstructing and analyzing traffic accidents involving buses and bus-like vehicles. This paper presents a comprehensive method for the reconstruction of bus accidents, introducing the analysis of human body injury as an auxiliary approach to verify simulation results and improve the accuracy of the overall judgment, in addition to the trajectory optimization technique used in conventional car accident reconstruction, which ignores human body injury. Based on clinical results and information collected from the accident sites, studies of body injury, which act as feedback to check and guide ordinary simulations, were carried out to investigate the severity levels and dynamic response of the human body under the conditions calculated by the common method. Within the method, corresponding modifications of the modeling, calculation and simulation are made, based on comparisons between predicted injury parameters and the actual effects on victims. Through the reconstruction and analysis of two real-life bus accident cases, this paper indicates the general routine of the method for common cases. The research applies two numerical reconstruction techniques, namely multi-body dynamics and trajectory optimization. With the help of these two numerical modeling skills, preliminary results indicate that the combined reconstruction method can reflect the process of a bus accident reasonably well. In comparison with conventional methods, it provides more reliability as well as accuracy.
Keywords: Bus accident; accident reconstruction; injury; occupant kinematics;
trajectory optimization
Title of the Paper: An
Optimization for the Design of a Simple Asynchronous Processor
Authors:
Sun-Yen Tan, Wen-Tzeng Huang
Abstract: The asynchronous circuit style considered here is based on micropipelines, a style used to develop asynchronous microprocessors at Manchester University. This paper presents engineering work on developing a micropipelined Stump processor. The work demonstrates that VHDL can be used to describe the behaviour of micropipelined systems. It also compares 2-phase and 4-phase implementations in transistor count, speed, and energy. Though the nature of the work is mainly engineering, some significant new insights were gained in the course of the work. The 2-phase circuits perform well in speed; this is because the rising and falling transitions of the 4-phase circuits follow the same routes. Asymmetric delays with a fast reset circuit can be applied to improve performance. The fastest speed is 1.55 MIPS for the synthesized two-phase processor, and the lowest power consumption is 362.33 fJ for the synthesized four-phase long-hold processor.
Keywords: Asynchronous design, Micropipelines, Processor, VHDL, Synthesis
Issue 6, Volume 10, June 2011
Title of the Paper: A
Novel Image Encryption Algorithm Using Pixel Shuffling and BASE 64 Encoding
Based Chaotic Block Cipher (IMPSBEC)
Authors:
G. A. Sathishkumar, K. Bhoopathy Bagan
Abstract: Image encryption is widely used to secure the transmission of data over the open Internet and internetworks. Each type of data has its own unique features; therefore, different data require different types of encryption algorithm. Most present-day techniques are suitable for textual data and not for multimedia-rich data such as images. Combined with nonlinear dynamic (chaotic) maps, a new algorithm is developed and applied to image-based cryptosystems. In this work, we propose a pixel shuffling, Base64 encoding based algorithm, which is a combination of block permutation, pixel permutation and value transformation. In general, diffusion and permutation are performed in an iterative fashion. These two methods are applied alternately in every round of the encryption process, and at least four chaotic sub-keys are employed in every round of the primitive encryption process. Decryption has the same structure, operating in reverse order. Statistical analysis shows that the proposed algorithm has good immunity to various attacks and is suitable for various software and hardware applications. A new approach is proposed to generate a random-bit sequence with a high degree of randomness. The proposed algorithm is a better alternative to satisfy the need for information security services. The performance of the proposed approach is tested for randomness using various testing rules and statistical tests. The results of the various types of analysis are encouraging and imply that the proposed approach trades off well between speed and protection. Hence, it is suitable for real-time image transmission and wireless communication applications.
Keywords: Image encryption, Base 64 encoding, chaotic maps, logistic map and
block cipher
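The chaotic ingredient named in the keywords, the logistic map, is often used to derive a keystream; the sketch below shows that idea in isolation (parameters r and x0 are illustrative; the paper's full scheme adds permutation and Base64 stages):

```python
def logistic_keystream(n, x0=0.731, r=3.99):
    """Yield n pseudo-random bytes from the logistic map x <- r*x*(1-x)."""
    x = x0
    out = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def xor_pixels(pixels: bytes, key: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(pixels, key))

# Encryption and decryption are the same XOR with the same keystream.
ct = xor_pixels(b"\x10\x20\x30", logistic_keystream(3))
assert xor_pixels(ct, logistic_keystream(3)) == b"\x10\x20\x30"
```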
Title of the Paper: LDAG:
A New Model for Grid Workflow Applications
Authors:
Guiping Wang, Yan Wang
Abstract: Grid workflows and their applications are one of the main focuses of Grid computing. Due to data or control dependencies between tasks and the requirement of having no directed circuit, the Directed Acyclic Graph (DAG) is a natural model for Grid workflows, and it has been used extensively in Grid workflow modeling. Some workflow applications have an additional requirement: each task should be accomplished at an expected stage, that is, at a given level. In this paper, we discuss such workflow applications in depth and propose a new DAG model, which we call LDAG. In LDAG, each node possesses a level. Several cases of node levels are discussed in detail. For a reasonable one of these cases, we propose a topological sorting algorithm. The algorithm consists of two phases, namely Level Adjusting and Topological Sorting. We discuss some relevant problems, such as the choice of stack or queue, the detection of directed circuits, the complexity of the algorithm, etc. Experiments and analysis of LDAG and the topological sorting algorithm show their correctness and efficiency in modeling Grid workflows.
Keywords: Directed Acyclic Graph (DAG), LDAG, Grid workflow, Level, Topological sorting, Directed circuit
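A level-respecting topological sort in the spirit of the two-phase algorithm described above might look like this sketch (Kahn's algorithm with a level-adjusting pass); the details are illustrative, not the paper's exact procedure:

```python
from collections import deque

def ldag_sort(nodes, edges, level):
    succ = {u: [] for u in nodes}
    indeg = {u: 0 for u in nodes}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # Phase 1: level adjusting along a plain topological order, raising each
    # node's level to at least one more than its deepest predecessor.
    order, q = [], deque(u for u in nodes if indeg[u] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v in succ[u]:
            level[v] = max(level[v], level[u] + 1)
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    if len(order) != len(nodes):        # leftover nodes imply a directed circuit
        raise ValueError("graph contains a directed circuit")
    # Phase 2: emit nodes level by level (stable, so still topological).
    return sorted(order, key=lambda u: level[u])
```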
Title of the Paper: Certain
Investigation on MRI Segmentation for the Implementation of CAD System
Authors:
J. Jaya, K. Thanushkodi
Abstract: The aim of this work is to develop a Computer Aided Diagnosis (CAD) system for the detection of brain tumors using a parallel implementation of an ant colony optimization (ACO) system for medical image segmentation, enabling rapid extraction of the Region of Interest (ROI) from images for diagnostic purposes in the medical field. For ROI segmentation, a metaheuristic-based Parallel Ant Colony Optimization (PACO) approach has been implemented. The system has been simulated in MATLAB for parallel processing, using the master-slave approach and information exchange. The scheme was tested on up to 10 real MRI brain images. Parallelism is inherent in program loops, and the work focuses on performing the search operation in parallel. The computational results show that the parallelization approach of the parallel ACO system enables the use of an intensity similarity measurement technique, owing to the capability of parallel processing. Medical image segmentation and early-stage detection play vital roles in many health-related applications such as medical diagnostics, drug evaluation, medical research, training and teaching. Given the rapid progress in technologies for segmenting digital images for diagnostic purposes, the parallel ant-based CAD system is technologically feasible for the medical domain and can help reduce the mortality rate.
Keywords: ACO, CAD system, MRI, PACO, ROI and Segmentation
Issue 7, Volume 10, July 2011
Title of the Paper: Applying
Data Mining and Grey QFD to Mine the Dynamic Trends for Computer Life
Cycle-oriented Green Supply
DOWNLOAD
FULL PDF
Authors:
Chih-Hung Hsu, An-Yuan Chang, Hui-Ming Kuo
Abstract: Green products can reduce the environmental burden during design and
disposal. The most approved technique to evaluate the environmental profile of
a green product is the life cycle assessment. Data mining has also been
successfully applied in many fields. However, little research has been done in
the quality function deployment of mining the dynamic trends of customer
requirements and engineering characteristics, using data mining and grey
theory. This study proposed an approach to use data mining and grey theory in
quality unction deployment for mining dynamic trends of the computer life
cycle-oriented green supply. An Empirical example is provided to demonstrate
the applicability of the proposed approach. Certain advantages may be observed
when the dynamic and future requirements trends were identified, using the
proposed approach. Since CRs can change rapidly, the database of CRs must be
updated continually; therefore, the proposed approach in this study, will
continually mine the database and identify the dynamic trends for the
designers and manufacturers. The results of this study can provide an
effective procedure of mining the dynamic trends of CRs and ECs for improving
customer satisfaction and green competitiveness in the marketplace.
Keywords: Data mining, Grey theory, Quality function deployment, Dynamic
trends, Life cycle, Green supply
Title of the Paper: Cryptanalysis
of Simplified-DES using Computational Intelligence
Authors:
Vimalathithan R., M. L. Valarmathi
Abstract: Cryptanalysis with computational intelligence has gained much interest in recent years. This paper presents an approach for breaking the key used in the Simplified Data Encryption Standard (S-DES) using a Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and a novel approach called Genetic Swarm Optimization (GSO), obtained by combining the effectiveness of GA and PSO. A ciphertext-only attack is adopted here, and an optimum key is produced using letter-frequency analysis as the cost function. The key is optimized using the capabilities of computational intelligence, and the experimental results indicate that GSO is an effective tool that takes less time to break the key used in S-DES and reduces the search space by nearly a factor of 6.
Keywords: Cryptanalysis, ciphertext-only attack, Genetic Algorithm, Particle Swarm Optimization, Genetic Swarm Optimization, cost, plaintext and ciphertext
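The letter-frequency cost function described above can be sketched as follows for a ciphertext-only attack; the truncated English frequency table and the placeholder sdes_decrypt are assumptions for illustration, not the paper's exact settings:

```python
ENGLISH_FREQ = {'e': .127, 't': .091, 'a': .082, 'o': .075, 'i': .070,
                'n': .067, 's': .063, 'h': .061, 'r': .060}  # truncated table

def frequency_cost(text):
    """Distance between observed letter frequencies and English norms."""
    letters = [c for c in text.lower() if c.isalpha()]
    n = max(len(letters), 1)
    return sum(abs(letters.count(ch) / n - f) for ch, f in ENGLISH_FREQ.items())

def key_fitness(key, ciphertext, sdes_decrypt):
    """Lower cost = candidate key yields more English-like plaintext."""
    return frequency_cost(sdes_decrypt(ciphertext, key))
```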
Title of the Paper: Enhancements
to Reputation Based Trust Models for Improved Reliability in Grid Computing
Authors:
Srivaramangai P., Rengaramanujam Srinivasan
Abstract: A Grid integrates and coordinates resources and users from different domains. Grid computing is an interconnected computer system in which machines share highly heterogeneous resources. Grid computing and its related technologies will only be adopted by users if they are confident that their data and privacy are secured, and that the system is as scalable, robust and reliable as their own local systems. Trust and reputation systems have been recognized as playing an important role in decision making on the Internet. Reputation-based systems can be used in a Grid to improve the reliability of transactions. Reliability is the probability that a process will successfully perform its prescribed task without any failure at a given point of time. Hence, ensuring reliable transactions plays a vital role in Grid computing. To achieve reliable transactions, mutual trust must be established between the initiator and the provider. Trust is measured by using reputation, where reputation is the collective opinion of others. The main purpose of security mechanisms in any distributed environment such as the Grid is to provide protection against malicious parties. There is a whole range of security challenges that are yet to be met by traditional approaches. Traditional security mechanisms such as authentication and authorization typically protect resources from malicious users by restricting access to authorized users only. However, in many situations users have to protect themselves from those who offer resources, so the problem is, in fact, reversed. Information providers can deliberately mislead by providing false information, and traditional security mechanisms are unable to protect against this type of security threat. Trust and reputation systems, on the other hand, can very well provide protection against such threats. Reputation models can be designed so that they provide reliability for both users and providers. Reputation systems provide a way of building trust through social control, by utilizing community-based feedback about past experiences of peers to help make recommendations and judgments on the quality and reliability of transactions. Reputation and trust systems are soft security mechanisms which can assure behavioral conformity. In this paper, two new reputation-based trust models are proposed. The first model, Model 1, uses a new factor called compatibility, which is based on Spearman's rank correlation. The feedbacks of recommenders that are incompatible with those of the initiator are eliminated using the compatibility factor. Model 2 is an improvement over Model 1: new factors are included for measuring direct trust. This comprehensive reputation-based trust model is proposed in order to effectively evaluate the trustworthiness of different entities and to address various malicious behaviors. Two important factors, context and size, are incorporated in evaluating the trustworthiness of entities.
Keywords: Grid computing, Reputation, Trust, Reliability
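The compatibility factor based on Spearman's rank correlation, as described above, might be sketched like this (no tie handling; the cut-off threshold is an illustrative assumption):

```python
def spearman(xs, ys):
    """Spearman's rank correlation for tie-free rating vectors."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, 1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

def compatible(initiator_ratings, recommender_ratings, threshold=0.5):
    """Keep a recommender's feedback only if ranking agreement is high."""
    return spearman(initiator_ratings, recommender_ratings) >= threshold
```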
Issue 8, Volume 10, August 2011
Title of the Paper: Market
Information Needs Risk Assessment toward ICT Usage for Green Bean Producers in
Dakar Region of Senegal
Authors:
Wen-I Chang, Chao-Lin Tuan
Abstract: In Senegal, information and communication technologies (ICTs) have been applied to accelerate the development of horticulture. Farmers can access information about weather, market prices, and production volumes through ICT-based information systems. However, little is known about the nature and limitations of actual ICT usage among farmers in rural areas. Green bean is one of the dominant garden crops in Senegal, and market information is crucial to its farm management. Therefore, this study aims to assess the marketing risks, ICT usage, and information needs of green bean producers to further promote the use of ICT-based information systems. A survey was conducted in Dakar Region, the chief green bean production area in Senegal. The results show that perishability and competition were the main marketing risks of green bean producers. The mobile phone and the telecentre were the most commonly adopted ICTs in their daily life. Their key information needs included wholesale, retail, and input prices. Language and cost were the major factors limiting further ICT usage. Furthermore, female producers showed vulnerability to price risk. Younger producers appeared to have relatively higher usage of TV and household telephones, while older producers had higher radio usage. Similarly, higher education was positively correlated with higher information needs on weather and agricultural policy. Among ethnic groups, Serer and other ethnic minority groups appeared to be more vulnerable to marketing risks. Members of producers' associations seemed to have less concern about marketing risks and higher radio usage. Meanwhile, telecentre users showed higher marketing risks and greater information needs, indicating that the telecentre is one of the key media for assisting these users. In sum, the findings of this study suggest that tailored information requires accessible media and a proper format to reach rural producers. Based on the results, there is a need to develop an information system supported by voice service in local dialects, as well as reliable and cost-effective power sources. Finally, a research model for horticultural market information systems is also proposed to meet users' needs and enhance growth opportunities for the horticulture industry in Senegal.
Keywords: Green bean, horticultural marketing, ICT usage, information needs,
marketing risks, Senegal
Title of the Paper: Agent
Based Load Balancing Scheme using Affinity Processor Scheduling for Multicore
Architectures
Authors:
G. Muneeswari, K. L. Shunmuganathan
Abstract: A multicore architecture, otherwise known as a chip multiprocessor (CMP), packs many processor cores onto a single chip and utilizes hyper-threading technology. Adding a large number of processor cores brings massive advancements to the parallel computing paradigm. The enormous performance potential of the multicore platform poses many challenges for task allocation and load balancing across the processor cores, which is a crucial concern from the operating-system scheduling point of view. To harness this large computing capacity, efficient resource allocation schemes are needed. A multicore scheduler is a resource management component of a multicore operating system that focuses on distributing the load of highly loaded processors to lightly loaded ones so that the overall performance of the system is maximized. We previously proposed a hard-soft processor affinity scheduling algorithm that minimizes the average waiting time of non-critical tasks in the centralized queue and avoids context switching of critical tasks. In this paper we incorporate an agent-based load balancing scheme for the multicore processor using the hard-soft processor affinity scheduling algorithm. Because round-robin scheduling with soft affinity is used for non-critical tasks, load balancing for them is done automatically. We modified and simulated the Linux 2.6.11 kernel process scheduler to incorporate the hard-soft affinity processor scheduling concept. The load balancing performance is reported against different load balancing algorithms, and we observe a performance improvement in response time under various homogeneous and heterogeneous load conditions. The results also show a comparison of our agent-based load balancing algorithm against traditional static and dynamic sender-initiated and receiver-initiated load balancing algorithms.
Keywords: Hard Affinity, Soft Affinity, Scheduler, Middle Agent, Processor Agent, Multicore Architecture, Scheduling, Agent Control Block, Load balancing, Response time
Title of the Paper: Image
Analysis Based on the Discrete Magnetic Field Generated by the Virtual Edge
Current in Digital Images
Authors:
X. D. Zhuang, N. E. Mastorakis
Abstract: In this paper, the spatial properties of the magneto-static field generated by a stable current are discussed and exploited in image analysis. The region-division feature of the magnetic field generated by a current element on the 2D plane is investigated experimentally for some simple test images. The virtual edge current in gray-scale images is presented by a magneto-static analogy, composed of tangent edge vectors as a discrete form of the physical current element. The virtual magnetic field generated by the edge current in digital images is investigated experimentally and applied to region border detection and region division. A novel image segmentation method is proposed based on the virtual magnetic field generated by the edge current. The experimental results demonstrate the effectiveness of the proposed method and indicate the promising application of physics-inspired methods to image processing tasks.
Keywords: Image analysis, virtual edge current, magnetic field, tangent edge
vector, image segmentation
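The magneto-static analogy above can be illustrated with a brute-force sketch that sums Biot-Savart-style contributions of tangent edge vectors at every pixel; this is purely illustrative of the idea and far slower than a practical implementation:

```python
import numpy as np

def virtual_field(gray):
    """Out-of-plane virtual field summed from tangent edge vectors."""
    gy, gx = np.gradient(gray.astype(float))
    tx, ty = -gy, gx                     # tangent edge vector: gradient rotated 90 degrees
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    B = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            rx, ry = x - xs, y - ys      # vectors from every source pixel to (x, y)
            r2 = rx * rx + ry * ry
            r2 = r2.astype(float)
            r2[y, x] = np.inf            # exclude the self-contribution
            # z-component of (t x r) / |r|^3, summed over all source pixels
            B[y, x] = np.sum((tx * ry - ty * rx) / r2 ** 1.5)
    return B
```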
Issue 9, Volume 10, September 2011
Title of the Paper: Evaluating
Peer Behaviour in Distributed Participatory Sensing
Authors:
Ramaprasada R. Kalidindi, Kvsvn Raju, V. Valli Kumari, C. S. Reddy
Abstract: Recent advances in ubiquitous computing and the availability of low-cost sensors have led to the widespread use of sensor networks in civilian applications. These networks, along with multisensory personal devices, generate a lot of data in the digital domain. Harnessing this data for urban sensing applications reduces the cost of implementation. This is possible when people share their data as a community service. However, people hesitate to participate because of a trust deficit. Instilling trust among the participants will enhance people's participation and open the way for newer applications that share data among people. This paper describes a model for data sharing based on computing confidence among networked peers. Social interactions in the digital domain and reputation in the community establish goodwill among peers. This goodwill, and the trust in the various control factors that influence a peer, are used to evaluate its behaviour. Trusting a peer's behaviour may involve risk; otherwise there is an opportunity. More opportunity than risk induces confidence in a peer. Finally, this confidence in the peer decides whether or not to share data.
Keywords: Trust management, Privacy control, Risk, Behaviour aware computing,
Participatory sensing, Urban sensing
Title of the Paper: On
Performance Analysis of Hybrid Algorithm (Improved PSO with Simulated
Annealing) with GA, PSO for Multiprocessor Job Scheduling
Authors:
K. Thanushkodi, K. Deeba
Abstract: Particle Swarm Optimization is currently employed in several optimization and search problems due to its ease of use and its ability to find solutions successfully. A variant of PSO, called Improved PSO, is developed in this paper and hybridized with a simulated annealing approach to achieve better solutions. The hybrid technique is employed in order to improve the performance of the Improved PSO. This paper shows the application of the hybrid Improved PSO to scheduling multiprocessor tasks. A comparative performance study is reported. It is observed that the proposed hybrid approach gives better solutions for multiprocessor job scheduling.
Keywords: PSO, Improved PSO, Simulated Annealing, Hybrid Improved PSO, Job
Scheduling, Finishing time, waiting time
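The hybridization idea above, a PSO update followed by a simulated-annealing acceptance test, might be sketched as follows; all parameters are illustrative, not the paper's settings:

```python
import math, random

def pso_sa_step(x, v, pbest, gbest, cost, T, w=0.7, c1=1.5, c2=1.5):
    """One particle update with an SA-style acceptance of the new position."""
    r1, r2 = random.random(), random.random()
    v_new = [w * vi + c1 * r1 * (pb - xi) + c2 * r2 * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    x_new = [xi + vi for xi, vi in zip(x, v_new)]
    delta = cost(x_new) - cost(x)
    # SA acceptance: always take improvements, sometimes accept worse moves.
    if delta <= 0 or random.random() < math.exp(-delta / T):
        return x_new, v_new
    return x, v_new
```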
Title of the Paper: Automatic
Edge Detection using Vector Distance and Partial Normalization
Authors:
Shuhan Chen, Weiren Shi, Kai Wang
Abstract: This paper proposes a novel edge detection method for both gray-level images and color images, which can overcome the limitations of gradient-based edge detection methods. A vector distance between the feature vector and the minimum vector, which determines the edge intensity, is defined based on four directional summed magnitude differences in a mask, and partial normalization is applied to facilitate threshold selection. This paper also proposes an improved approach to determining the edge direction. According to the improved edge direction, non-maxima suppression is applied to thin the edges, and final edges are extracted automatically using Otsu's method, even in a changing environment. Extensive experimental results demonstrate that the proposed method performs well in preserving low-contrast edges, in threshold selection, and in processing time.
Keywords: Edge detection, Color edge detection, Vector distance, Partial
normalization, Non-maxima suppression
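The automatic thresholding step named above, Otsu's method, can be sketched as follows on an edge-intensity histogram; it is independent of the paper's vector-distance map:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Threshold that maximizes between-class variance of a histogram."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                      # class-0 probability up to each bin
    m = np.cumsum(p * centers)             # cumulative mean
    mg = m[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mg * w0 - m) ** 2 / (w0 * (1 - w0))
    return centers[np.nanargmax(between)]
```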
Title of the Paper: SWIDE:
Semantic Web Integrated Development Environment
Authors:
Islam Hany Harb, Abdurrahman A. Nasr, Salah Abdel-Magid, Hany Harb
Abstract: An ontology is a specification of a conceptualization. This paper introduces an environment for developing semantic web applications. The environment integrates many tools, including editing capabilities, a logic reasoner and a semantic search engine. The design and implementation of a generalized ontology editor is presented, through which the user may create, edit, validate, open, search (locally and globally), or visualize an ontology or an instance file. The user may edit an instance to be stored in an RDF/XML file, OWL/XML, an XML knowledge base or other formats. The user may present the ontology hierarchy and the knowledge base in tabular form. The environment provides an interface through which the user may consult the knowledge base using SQL-like statements. It also allows the user to map one ontology to another. It further introduces a virtualization concept, providing a mechanism to categorize ontology instances based on given ontology features. A logic reasoner is provided so that the truth of an instance can be checked against a specific knowledge base. A semantic search engine is also available, both locally and globally.
Keywords: Ontology Editor, RDF, OWL, XML, Semantic Web, SPARQL, Jena
Issue 10, Volume 10, October 2011
Title of the Paper: A
Complete Path Representation Method with a Modified Inverted Index for
Efficient Retrieval of XML Documents
Authors:
Hsu-Kuang Chang, King-Chu Hung, I-Chang Jou
Abstract: Compiling documents in the extensible markup language (XML) increasingly requires access to data services that provide both rapid response and precise search. Efficient data services should be based on a skillful representation that can support low-complexity, high-precision search capabilities. In this paper, a novel complete path representation (CPR) associated with a modified inverted index is presented for the provision of efficient XML data services, where queries can be versatile in terms of predicates. CPR completely preserves hierarchical information, and the new index is used to store semantic information. The CPR approach can provide template-based indexing for fast data search. An experiment is also conducted to evaluate the CPR approach.
Keywords: XML, DTD, Complete path representation (CPR), Structural summary
tree (SST), versatile query
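The general idea of path-based indexing can be illustrated with the following sketch, which maps each complete root-to-node path to the documents containing it; the paper's CPR and modified inverted index carry more structure than this:

```python
from collections import defaultdict
from xml.etree import ElementTree as ET

def index_document(doc_id, xml_text, index):
    """Record every root-to-node tag path of a document in the index."""
    root = ET.fromstring(xml_text)
    def walk(node, prefix):
        path = f"{prefix}/{node.tag}"
        index[path].add(doc_id)
        for child in node:
            walk(child, path)
    walk(root, "")

index = defaultdict(set)
index_document(1, "<book><title>XML</title><author>Chang</author></book>", index)
print(index["/book/title"])   # {1}
```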
Title of the Paper: Two-Dimensional
Clustering Algorithms for Image Segmentation
Authors:
Intan Aidha Yusoff, Nor Ashidi Mat Isa
Abstract: This paper introduces modified versions of the K-Means (KM) and Moving K-Means (MKM) clustering algorithms, called the Two-Dimensional K-Means (2D-KM) and Two-Dimensional Moving K-Means (2D-MKM) algorithms respectively. The performance of the two proposed algorithms is compared with three commonly used conventional clustering algorithms, namely K-Means (KM), Fuzzy C-Means (FCM), and Moving K-Means (MKM). The new algorithms incorporate the median intensity of each pixel's neighborhood together with the pixel's own intensity when assigning the pixel to the nearest cluster. The observed qualitative and quantitative results show that 2D-KM and 2D-MKM perform better than KM, FCM, and MKM in producing more homogeneous segmentation results, while taking less execution time than FCM.
Keywords: Two-Dimensional K-Means (2D-KM), Two-Dimensional Moving K-Means
(2D-MKM), Image Segmentation, Clustering
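The two-dimensional feature described above, a pixel's own intensity paired with the median of its neighborhood, can be sketched as follows, using SciPy's k-means as an illustrative stand-in for the authors' KM/MKM variants:

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.cluster.vq import kmeans2

def segment_2d_km(gray, k=3):
    """Cluster pixels on the 2-D feature (own intensity, 3x3 median)."""
    med = median_filter(gray.astype(float), size=3)
    feats = np.stack([gray.ravel().astype(float), med.ravel()], axis=1)
    _, labels = kmeans2(feats, k, minit="++")
    return labels.reshape(gray.shape)
```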
Title of the Paper: Semantic
Classification of Human Behaviors in Video Surveillance Systems
DOWNLOAD
FULL PDF
Authors:
Alberto Amato, Vincenzo Di Lecce
Abstract: The semantic analysis of the human behavior in video streaming is
still an open issue for the computer vision research community, especially
when real-time analysis of complex scenes is concerned. The researchers’
community has achieved many progresses in this field. A popular class of
approaches has been devised to enhance the quality of the semantic analysis by
exploiting some background knowledge about scene and/or the human behavior,
thus narrowing the huge variety of possible behavioral patterns by focusing on
a specific narrow domain. Aim of this paper is to present an innovative method
for semantic analysis of human behavior in video surveillance systems.
Typically, this kind of systems are composed of a set of fixed cameras ach one
monitoring a fixed area. In the proposed methodology, the actions performed by
the human beings are described by means of symbol strings. For each camera a
grammar is defined to classify the strings of symbols describing the various
behaviors. This system proposes a generative approach to human behavior
description so it does not require a learning stage. Another advantage of this
approach consists in the simplicity of the scene and motion descriptions so
that the behavior analysis will have limited computational complexity due to
the intrinsic nature both of the representations and the related operations
used to manipulate them. This methodology has been used to implement a system
to classify human behaviors in a scene. The results are discussed in this
paper and they seem to be encouraging.
Keywords: Human behavior analysis, grammar based approach, semantic analysis
of video streaming, video surveillance systems, generative human behavior
description
Title of the Paper: Design and Evaluation of Parallel, Scalable, Curve Based Processor over Binary Field
Authors:
Rahila Bilal, M. Rajaram
Abstract: Implementing public-key cryptography systems is a challenge for most application platforms, since several factors have to be considered in selecting the implementation platform. Elliptic Curve Cryptography (ECC) is considered much more suitable than other public-key algorithms: it offers lower power consumption and higher performance, and it can be implemented in small areas. In this work, a scalable and parallel framework for an FPGA-based, dual-field (prime and binary field) ECC processor is explored. Using the Altera Quartus software tool, a 160-bit ECC processor core with four 32-bit arithmetic units is evaluated on an EP3SE50F780C3 device. Scalar multiplication is performed in 445 μs and occupies 9763 LUTs.
Keywords: Public-Key cryptography, ECC, Prime Field, Binary Field, FPGA, Scalar Multiplication
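The operation benchmarked above, scalar multiplication, is classically computed by double-and-add; the sketch below keeps the curve arithmetic abstract (point_add and point_double are placeholders), since the paper's contribution is the parallel hardware beneath these group operations:

```python
def scalar_multiply(k, P, point_add, point_double, identity=None):
    """Compute k*P by scanning the bits of k from MSB to LSB."""
    R = identity                      # None stands for the point at infinity
    for bit in bin(k)[2:]:
        if R is not None:
            R = point_double(R)       # doubling the identity stays identity
        if bit == "1":
            R = P if R is None else point_add(R, P)
    return R
```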
Issue 11, Volume 10, November 2011
Special Issue: Applied Soft Computing
Editor: Les Sztandera
Title of the Paper: Incremental
Radial Basis Function Computation for Neural Networks
Authors:
Vaclav Skala
Abstract: This paper presents a novel approach for the incremental computation of Radial Basis Functions (RBF) for fuzzy systems and neural networks with a computational complexity of O(N^2). The technique enables efficient insertion of new data and removal of selected or invalid data. RBFs are used across many fields, including geometry, image processing and pattern recognition, medical applications, signal processing, speech recognition, etc. The main prohibitive factor is the computational cost of the RBF computation for larger data sets, or when the data set changes and the RBFs have to be recomputed. The presented technique is applicable to fuzzy systems in general as well, offering a significant speed-up due to the lower computational complexity of the approach. Incremental RBF computation also enables fast RBF recomputation on "sliding window" data thanks to fast insert/remove operations, a very significant factor especially in the case of guided neural networks. Generally, RBF-based interpolation is very often used for scattered scalar data interpolation in n-dimensional space. As there is no explicit order in the data sets, computations are quite time consuming, which limits usability even for static data sets. The computational complexity of RBF computation for N values is O(N^3), or O(kN^2) where k is the number of iterations if an iterative method is used, which is prohibitive for many real applications. The inverse matrix can also be computed by the Strassen algorithm, based on matrix block notation, with O(N^2.807) complexity. An even worse situation occurs when interpolation has to be performed over non-constant data sets, as the whole system of equations determining the RBFs has to be recomputed whenever the data set changes. This situation is typical of applications in which some values become invalid and new values are acquired.
Keywords: RBF, interpolation, incremental computation, neural networks, fuzzy
systems, algorithm, matrix inversion
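The O(N^2) insertion the abstract describes can be illustrated with a bordered-matrix inverse update via the Schur complement; the Gaussian basis and shape parameter are illustrative assumptions, not necessarily the paper's choices:

```python
import numpy as np

def rbf(r, eps=1.0):
    return np.exp(-(eps * r) ** 2)       # Gaussian radial basis function

def add_point(A_inv, pts, new_pt):
    """Inverse of the (N+1)x(N+1) RBF matrix, updated from the NxN inverse
    in O(N^2) instead of a full O(N^3) re-inversion."""
    b = rbf(np.linalg.norm(pts - new_pt, axis=1))[:, None]   # border column
    d = rbf(0.0)                                             # new diagonal entry
    u = A_inv @ b
    s = d - (b.T @ u).item()             # Schur complement (scalar)
    top = A_inv + (u @ u.T) / s
    return np.block([[top, -u / s], [-u.T / s, np.array([[1.0 / s]])]])
```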
Title of the Paper: Classification
Data Mining with Hybrid Fuzzy Logic Aggregation
Authors:
John F. Sanford, Les M. Sztandera
Abstract: Fuzzy logic is applied to the category discrimination problem of identifying mammary lesions as benign or malignant. Results of other similar studies are reviewed. The current analysis expands the fuzzy logic approach by using the normal distribution function as the set membership function and using a genetic algorithm to optimize performance on the training partition. The approach is applicable to problems with an arbitrarily large number of parameters. Two different data sets are examined. Data are partitioned into a training set and a validation set, and each set is segregated into benign and malignant records. Values of mean and standard deviation are initially computed from the associated attributes and differ between the benign and malignant records. In one training method, the standard deviations are adjusted to minimize overall error. In a second method, a bias adjusts the importance of each membership function. Defuzzification is accomplished in three ways: a modified averaging and OR process; comparison of multiplied fuzzy set values; and comparison of multiplied squared set values. Results are compared with results obtained through statistical logistic regression.
Keywords: Fuzzy data analysis, discrimination, statistical analysis, screening
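The membership and defuzzification choices described above (normal-distribution memberships, comparison of multiplied fuzzy set values) might be sketched as follows; parameter estimation and class names are illustrative:

```python
import math

def gaussian_membership(x, mean, std):
    """Normal-shaped fuzzy membership of attribute value x."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2))

def classify(record, benign_params, malignant_params):
    """params: one (mean, std) per attribute, estimated per class from training data."""
    mb = math.prod(gaussian_membership(x, m, s)
                   for x, (m, s) in zip(record, benign_params))
    mm = math.prod(gaussian_membership(x, m, s)
                   for x, (m, s) in zip(record, malignant_params))
    return "benign" if mb >= mm else "malignant"
```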
Title of the Paper: Extracting
Information from Failure Equipment Notifications – Use of Fuzzy Sets to
Determine Optimal Inventory
Authors:
Les M. Sztandera
Abstract: This paper addresses the use of a data analysis approach to extract
information from a large number of failure equipment notifications. Based on
that, a fuzzy system, capable of learning and optimizing the knowledge from
historical evidence, is formed. Subsequently, its use as a guiding tool in
decision making processes at the strategic level (estimation of the number of
spare parts based on the warehouse location and type of failure), is outlined.
To highlight its advantages, the fuzzy sets approach for spare parts
allocation is compared with a probabilistic one.
Keywords: Computer repair parts; Optimal inventory; Data analysis; Fuzzy sets
Title of the Paper: Use
of a Genetic Algorithm – Neural Network Hybrid Algorithm in the Search for
High Efficiency Solid-State Phosphors
Authors:
Hugh Cartwright, Arsenij Leontjev
Abstract: Artificial Intelligence methods have been employed in the search for
solid-state phosphors with a high luminescence quantum yield. An Artificial
Neural Network was used to investigate how luminescence efficiency can be
linked to phosphor composition. The trained network was then coupled to a
Genetic Algorithm whose role was to locate the global optimum composition in
the search space. The compound Tb0.039Gd0.104Ce0.063Si0.401B0.393Oδ (where δ
indicates the stoichiometrically-required amount of oxygen) is estimated to be
the optimum oxide composition that generates the highest green phosphor
luminescence for use in tricolour white LEDs, when excited by a 400 nm light
source.
Keywords: Genetic Algorithm, Artificial Neural Network, phosphor, LED, oxide,
hybrid algorithm
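The GA-NN hybrid loop described above can be sketched generically, with a trained network standing in as the fitness function through an abstract predict callable; population sizes, rates, and the normalization are illustrative assumptions:

```python
import random

def ga_search(predict, n_vars=5, pop=40, gens=100, mut=0.1):
    """Evolve compositions; `predict` is a trained model scoring luminescence."""
    def normalize(x):                       # composition fractions sum to 1
        s = sum(x)
        return [v / s for v in x]
    population = [normalize([random.random() for _ in range(n_vars)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=predict, reverse=True)   # NN-estimated fitness
        parents = population[:pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_vars)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:                # point mutation
                child[random.randrange(n_vars)] = random.random()
            children.append(normalize(child))
        population = parents + children
    return max(population, key=predict)
```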
Issue 12, Volume 10, December 2011
Title of the Paper: Countermeasure
against the Jacobi symbol attack
Authors:
David Tinoco Varela
Abstract: Many types of physical attack (timing attacks, power consumption attacks, fault attacks, etc.) have been developed against cryptosystems in recent years. There is a real need to eliminate the vulnerabilities of cryptosystems, such as CRT-RSA or elliptic curve cryptosystems, that make them susceptible to those attacks. In 2006, Boreale described a new type of physical attack based on the Jacobi symbol concept. In this paper, a countermeasure against the Jacobi symbol attack is presented and implemented in two modular exponentiation algorithms to make them immune to such attacks.
Keywords: Cryptography, Security, Modular exponentiation algorithms, Side
channel attacks, Jacobi symbol, Embedded devices
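For context on the mathematical object at the heart of the attack, here is a standard sketch of the Jacobi symbol (a/n) for odd positive n, computed by the usual binary algorithm; the countermeasure itself is not reproduced here:

```python
def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0; returns -1, 0 or 1."""
    assert n > 0 and n % 2 == 1
    a %= n
    result = 1
    while a:
        while a % 2 == 0:             # pull out factors of two
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                   # quadratic reciprocity swap
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0
```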
Title of the Paper: A
Novel Jamming-Aware Metric for MHWN Routing
Authors:
B. Q. Kan, J. H. Fan
Abstract: In recent years, frameworks based on the multi-hop wireless network (MHWN) mechanism have received increasing attention. The broadcast nature of the wireless medium in an MHWN makes it extremely attractive, but also vulnerable to malicious attacks, so ensuring continuous network service becomes a critical problem, especially under jamming. Although some research has been conducted on countering jamming attacks, few works consider jamming dynamics. In this paper, we address the dynamic jamming problem in MHWNs. Our proposed solution takes interference avoidance mechanisms into careful account and proposes a proactive multi-path routing mechanism based on a novel jamming-aware metric. The proposed mechanisms need extra support in the form of routing Interference Activity (IA) entries to build more robust anti-jamming paths in an MHWN, while keeping the number of reroute requests low. Our evaluations based on NS2 show that the proposed mechanisms can provide robust anti-jamming paths for MHWNs.
Title of the Paper: Improving
the Generalization Capability of HIDMA with DeJong’s Gene Expression
Authors:
Jungan Chen, Feng Liang, Zhaoxi Fang
Abstract: In this work, an augmented hybrid immune detector maturation algorithm applied to anomaly detection is proposed. To improve the generalization capability, DeJong's gene expression is used. Experimental results show that the algorithm is more effective than other algorithms that use a binary string expression.
Keywords: Artificial immune system, generalization capability, hybrid immune
detector