1Department of Computer Science, K S W University, Bijapur, Karnataka, India
2Department of Computer Science and Engineering, Bangalore, Karnataka, India
The wavelet transform has emerged as a cutting-edge technique in image compression research. Wavelet theory and subband coding have generated a surge of interest in wavelet-based applications over the past decade. Image coding (or compression) is an important application that has benefited significantly from wavelet theory. Lossless image coding using the Embedded Zerotree Wavelet (EZW) is the main focus of this work and its sequel. The main purpose of this paper is to investigate the impact of the choice of orthogonal wavelet filter on EZW quality. We also examine the effect of the level of wavelet decomposition on compression efficiency. The wavelet filters used are Haar and Daubechies 4. The experimental results are compared, and a qualitative analysis is performed on the basis of compression time and reconstruction error after decompression for medical images.
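As a rough illustration of the decomposition step that precedes EZW coding, the sketch below performs one level of the 2D Haar transform; the function and the test image are illustrative assumptions, not the paper's implementation, and a real codec would iterate this on the LL subband.

```python
import numpy as np

def haar_2d_level(img):
    """One level of the 2D Haar wavelet transform.

    Returns the approximation (LL) and detail (LH, HL, HH) subbands.
    Illustrative sketch only; EZW then scans these subbands for
    significant coefficients across scales.
    """
    a = img.astype(float)
    # Transform rows: average and difference of adjacent pixel pairs.
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    # Transform columns of each intermediate result.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0
    return ll, lh, hl, hh

img = np.arange(16, dtype=float).reshape(4, 4)
ll, lh, hl, hh = haar_2d_level(img)
```

Smooth image regions produce near-zero detail coefficients, which is what makes zerotree coding effective.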
1MCA Department, Maharshi Markandeshwar University, Mullana, Haryana, India
2Chitkara Institute of Engineering and Technology, Rajpura, Punjab, India
This paper compares image perception in the human mind with image perception in electronic machines. It explains how an image reaches the human mind through the visual system, and the effect of increasing or decreasing the intensity of the source object on our eyes. The same is then compared with electronic image processing. Electronic image processing is modelled on the human visual system, so to understand the methods and algorithms used in image processing one must have a basic understanding of human visual perception. How colours are identified by humans as well as by machines is also discussed.
1Regional Institute of Management and Technology (RIMT), Mandi Gobindgarh, Punjab, India
2Chitkara Institute of Engineering and Technology, Rajpura, Punjab, India
Intrusion detection and corrective measures in networks are among the challenges of the fast-growing world of cyber crime. Network establishments face various types of threats on a routine basis. To transmit information efficiently across a network, an improved and reliable architecture is needed. Intrusion detection systems should be developed with utmost care to withstand both natural and intentional attacks. Moreover, the packet encryption algorithm should be designed so that a cracker is not able to change even a single bit of the confidential data. This paper proposes an efficient algorithm for packet encryption, along with a standard for detecting any kind of intercept attempt.
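To illustrate the single-bit tamper-detection property described above, the sketch below uses a standard HMAC tag; this is a textbook technique, not the paper's proposed algorithm, and the key handling is a placeholder assumption.

```python
import hashlib
import hmac
import os

KEY = os.urandom(32)  # shared secret; a stand-in for the paper's key setup

def seal(packet: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so any bit-flip in transit is detectable."""
    return packet + hmac.new(KEY, packet, hashlib.sha256).digest()

def verify(sealed: bytes) -> bool:
    """Recompute the tag over the payload and compare in constant time."""
    packet, tag = sealed[:-32], sealed[-32:]
    return hmac.compare_digest(tag, hmac.new(KEY, packet, hashlib.sha256).digest())

msg = seal(b"confidential data")
assert verify(msg)
# Flipping even one bit of the payload invalidates the tag.
tampered = bytes([msg[0] ^ 1]) + msg[1:]
assert not verify(tampered)
```

In practice the tag would be combined with encryption (encrypt-then-MAC) so that both confidentiality and integrity are covered.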
1,2DAVCET, Kanina, Mohindergarh, India
In this paper we illustrate the overall structure of swarm intelligence and its impact on net-centric computing environments. The paper presents a comprehensive look at swarm applications and their potential to solve complex problems in related areas. The effects of emergent externalities of swarm behaviour through its basic elements, such as groups/clusters, individuals/agents and inner/outer communications, are also studied to explain the role of swarming in improving the performance of net-centric systems. Self-organization is introduced as a result of such collective and cooperative strategies. The paper also examines the role of existing technologies and the related challenges in implementing real swarm systems.
1Associate Professor, GNDEC, LDH, India
2Principal, RIMT-IET, Mandi Gobindgarh, India
This paper presents a guideline engineering process that results in a number of artefacts leading to successful decision making in healthcare delivery. The EHR architecture is described in terms of a reference model and an archetype model covering the concepts of transaction, folder, data structures and data types. The presence of the components of Computer-Interpretable clinical Guidelines (CIGs) in the EHR allows the point-of-care application to better integrate the guideline with workflow and decision support. The EHR also supports workflows by identifying the process of carrying out the actions. The instruction reference model is explained with the purpose of providing detailed information about instructions, the states through which instruction execution proceeds, and how instruction connectors help to specify workflow patterns.
1Department of Computer Science and Applications, Kurukshetra University, Kurukshetra, India
2Department of CSE, DAVCET, Kanina, Mohindergarh, India
This paper presents recent work on intradomain routing optimization using legacy protocols, and identifies important areas where further research is needed. Shortest and optimal routing is discussed in the context of intradomain routing protocols. As an alternative to having the distributed routing protocols adapt to the prevailing traffic, the network operator or automated network management tools can modify the configuration of the static parameters that drive the operation of the routing protocols, using traffic engineering concepts. The objective of routing optimization is to balance the traffic load in the network with the goal of improving quality of service (QoS) and making optimal use of available resources. The paper also surveys the literature on traffic engineering and other algorithms that contribute to the optimization of intradomain routing.
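The mechanism described above, steering traffic by tuning static link weights, can be sketched with the shortest-path computation that link-state intradomain protocols such as OSPF run over those weights; the three-node topology below is a hypothetical example, not one from the paper.

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src over weighted links.

    graph: {node: {neighbour: weight}}. Minimal sketch of the
    computation underlying link-state routing protocols.
    """
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Raising the weight of link A->B diverts traffic onto the A->C->B path.
topo = {"A": {"B": 1, "C": 1}, "B": {"A": 1, "C": 1}, "C": {"A": 1, "B": 1}}
print(dijkstra(topo, "A")["B"])   # 1: direct link preferred
topo["A"]["B"] = 5
print(dijkstra(topo, "A")["B"])   # 2: via C after re-weighting
```

This is the lever traffic engineering uses: the operator changes a configured weight, and the protocol's own shortest-path machinery moves the load.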
1Head of Department, Maths & Computer Science, Sadhu Vaswani College, Bhopal, India
2Head of Department, Computer Application, MDS University, Ajmer, Rajasthan, India
3Lecturer, Department of Computer Science, Kendriya Vidyalaya, Jaipur, India
4Lecturer, Department of Computer Science, BIST, Bhopal, India
5Lecturer, Department of Computer Application, SCOPE, Bhopal, MP, India
An emerging trend in the Signal and Image Processing (SIP) community is the appearance of middleware and middleware standards that can be readily exploited for distributed computing applications. High performance computing and High Performance Embedded Computing (HPEC) applications will benefit significantly from highly efficient and portable computational middleware for signal and image processing. Open middleware standards such as VSIPL, MPI, CORBA, RMI, and Web Services (based on SOAP/XML) offer a unique opportunity for the rapid development of easily maintained HPEC codes that combine portability and flexibility across a number of applications. This middleware infrastructure will support the rapid development and deployment of portable, efficient, SIP-critical applications of immediate benefit to many. The use of distributed computing technologies for problem solving has been around for many years. The early paradigm of distributed computing was that of remote procedure calls (RPC). In recent years, however, this paradigm has shifted to the use of remote objects, owing to the acceptance of object-oriented programming practices. Even today, web services are built around the concept of messaging, and frequently these messages take the form of request/response-type remote procedure calls on remote objects. The existing and emerging standards for distributed computing have resulted in several possible middleware choices for the SIP community. This paper focuses on specific middleware standards for distributed computing, among them the Common Object Request Broker Architecture (CORBA).
1,2,3CSE Department, UIET, Kurukshetra University, Kurukshetra, India
4ECE Department, Kurukshetra Institute of Technology and Management, Kurukshetra, India
Software engineering is a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software. Many models have been developed within the software development life cycle to evaluate and improve its capabilities. This paper proposes two new D-tables (decision tables D1 and D2) which provide the necessary guidelines to a developer or organisation for selecting a System Development Methodology (SDM) by comparing traditional and object-oriented SDMs. This work is novel in that it covers both traditional and object-oriented SDM models, analyzes their pros and cons, and guides the developer in selecting a particular SDM approach. The outcome ultimately depends on the organization's decision, since how well they create software follows from how they define and execute their processes. In this paper, we focus on different SDM approaches and try to determine the suitability of each approach in software development.
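The decision-table idea can be sketched as a lookup from project criteria to a recommended methodology; the criteria and recommendations below are illustrative placeholders, not the paper's D1/D2 tables.

```python
# Toy decision table for SDM selection. Condition tuple:
# (stable_requirements, heavy_reuse_expected) -> recommended methodology.
# Entries are hypothetical examples, not the paper's D-tables.
D_TABLE = {
    (True,  False): "Waterfall (traditional SDM)",
    (True,  True):  "Object-oriented SDM",
    (False, True):  "Object-oriented iterative SDM",
    (False, False): "Incremental/prototyping SDM",
}

def recommend(stable_requirements: bool, heavy_reuse_expected: bool) -> str:
    """Look up the action entry for a given combination of conditions."""
    return D_TABLE[(stable_requirements, heavy_reuse_expected)]

print(recommend(True, True))  # Object-oriented SDM
```

The value of the table form is exhaustiveness: every combination of conditions has an explicit, reviewable recommendation.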
*Department of MCA, Gandhi Institute of Computer Studies, Gunupur, Rayagada, Orissa, India
**S.O.S. in Statistics, Pt. Ravishankar Shukla University, Raipur, Chhattisgarh, India
This paper deals with the reliability analysis of the wire rod mill system of an integrated steel plant. The system consists of two sets of crane and a charging grid. At any particular time, one set of crane and charging grid is in operation while the other set remains in standby mode. The main function of this mill is to support the blooming and billet mill area. Using the regenerative point technique, various characteristics of interest to system designers as well as operations managers have been evaluated, which helps improve the reliability of the overall system. Finally, graphs are used to illustrate the results.
1M.E., SSPM's COET, Amravati, India
2AP, SSPM's COET, Amravati, India
An image can be synthesized from a micrograph of various cell organelles by assigning a light intensity value to each cell organelle. The sensor signal is "digitized", that is, converted to an array of numerical values, each value representing the light intensity of a small area of the cell. Digital image processing is an area characterized by the need for extensive experimental work to establish the viability of proposed solutions to a given problem. Image processing modifies pictures to improve them, extract information, and change their structure. Image enhancement improves the quality (clarity) of images for human viewing. Removing blur and noise, increasing contrast, and revealing details are examples of enhancement operations. For example, an image of an endothelial cell might be of low contrast and somewhat blurred; reducing the noise and blur and increasing the contrast range could enhance it. The original image might have areas of very high and very low intensity that mask details.
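The contrast-enhancement step described above can be sketched as a percentile-based linear stretch; the percentile cut-offs and the tiny test image are illustrative assumptions, not values from the paper.

```python
import numpy as np

def contrast_stretch(img, low_pct=2, high_pct=98):
    """Linearly stretch intensities between two percentiles to [0, 255].

    Clipping at percentiles rather than the raw min/max keeps a few
    extreme pixels from masking detail in the rest of the image.
    """
    lo, hi = np.percentile(img, (low_pct, high_pct))
    out = (img.astype(float) - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

# A low-contrast patch (values crowded into 100-130) spread to full range.
cell = np.array([[100, 110], [120, 130]], dtype=np.uint8)
enhanced = contrast_stretch(cell)
```

After stretching, the narrow intensity band occupies the full 0-255 display range, which is what makes the faint structures visible.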
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License