IOSR Journal of Computer Engineering (IOSR-JCE)

Volume 10, Issue 2

Paper Type : Research Paper
Title : Data Security Model Enhancement In Cloud Environment
Country : India
Authors : Navia Jose, Clara Kanmani A
DOI : 10.9790/0661-01020106

Abstract: Cloud computing is one of the most important emerging technologies and plays a central role in the next-generation architecture of the IT enterprise. It has been widely accepted due to its ability to reduce computing costs while increasing flexibility and scalability. During the past few years, cloud computing has grown from a promising business idea into one of the fastest growing parts of the IT industry. In a cloud computing system, both application software and databases are moved to large data centers, where the data may no longer be secure in the hands of providers. IT organizations have expressed concerns about the various security issues that accompany the widespread adoption of cloud computing; these concerns stem from the fact that data is stored remotely from the customer's location. From the consumer's perspective, security concerns, especially data security and privacy protection issues, remain the primary inhibitor to adoption of cloud computing services. This paper describes an enhancement of an existing data security model for the cloud environment. The proposed data security model provides user authentication and data protection, and also ensures fast recovery of data.

Keywords - AES Algorithm, Byzantine fault tolerance, Data Security Model, Distributed Denial of Service (DDoS)
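The data-protection layer of such a model typically pairs encryption (AES, per the keywords) with an integrity check so the customer can detect tampering while the data sits with the provider. A minimal sketch of the verification step using Python's standard hmac module; the key, record, and function names are illustrative, not the paper's actual scheme:

```python
import hmac
import hashlib

SECRET_KEY = b"tenant-shared-secret"  # hypothetical per-user key, kept off the cloud

def tag(data: bytes) -> str:
    """Compute an HMAC-SHA256 tag before uploading the data to the provider."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, stored_tag: str) -> bool:
    """After download, detect tampering by the provider or in transit."""
    return hmac.compare_digest(tag(data), stored_tag)

blob = b"customer record #42"
t = tag(blob)
print(verify(blob, t))                # the untouched blob verifies
print(verify(b"tampered record", t))  # a modified blob does not
```

The constant-time compare_digest avoids leaking tag prefixes through timing, which matters when verification happens on an untrusted network path.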

[1] Peter Mell and Tim Grance, The NIST Definition of Cloud Computing, Version 15, 10-7-09.
[2] Eman M. Mohamed and Hatem S. Abdelkader, Enhanced Data Security Model, The 8th International Conference on Informatics and Systems (INFOS2012), Cloud and Mobile Computing Track, 14-16 May 2012.
[3] Cloud computing security, http://en.wikipedia.org/wiki/Cloud_computing_security.
[4] ZhaoYong-Xia and Zhen Ge ,"MD5 Research," Second International Conference on Multimedia and Information Technology, 2010
[5] C. Almond, A Practical Guide to Cloud Computing Security,27 August 2009
[6] Aayush Chhabra and Srushti Mathur, Modified RSA Algorithm - A Secure Approach, 2011
[7] N. Mead et al., "Security quality requirements engineering (SQUARE) methodology," Carnegie Mellon Software Engineering Institute.
[8] J. W. Rittinghouse and J. F. Ransome, Cloud Computing: Implementation, Management, and Security, Taylor and Francis Group, LLC, 2010.
[9] Lili Sun, Hua Wang, Xiaohui Tao,Yanchun Zhang and Jing Yang, "Privacy-Preserving Fine-Grained Access Control in Public Clouds, " 2011
[10] K. Cartrysse and J.C.A. van der Lubbe, "The Advanced Encryption Standard: Rijndael," supplement to the books "Basic methods of cryptography" and "Basismethoden cryptografie", October 2004.


Paper Type : Research Paper
Title : Service Based Content Sharing in the Environment of Mobile Ad-hoc Networks
Country : India
Authors : Hashmi Vallipalli, A.V.Praveen Krishna
DOI : 10.9790/0661-01020712

Abstract: The peer-to-peer model is an alternative to the traditional client-server networking model. File sharing on mobile devices is not easily achieved because of limited bandwidth and high cost, and network migration causes irregular disconnections and IP address changes. We adopt a short-range networking technology, Bluetooth, which is free to the user and sufficiently fast to make file transfer very practical, and present a peer-to-peer model that permits efficient file sharing between mobile smartphones over this low-cost transport. Our results show that peer-to-peer file transfer between today's mobile devices is practical, but that serving limits, unique to the mobile device environment, must be applied to the transfers. The upload-to-download ratio should be kept relatively low because of the higher current drain while transmitting. Where the target file system is very slow, larger file segments in Direct Memory Access (DMA) mode can be used instead of Programmed Input/Output (PIO) mode, and the use of UDP for content sharing is more suitable than the use of OBEX. Our design approach overcomes some of the barriers to acceptance: peer discovery and content distribution occur automatically without intervention from the user, and the transport is implemented in the application layer using existing standard protocols without modification.

Keywords: Peer-to-peer network, Direct Memory Access, Programmed Input/Output, mobile device environment, mobile smartphones, Bluetooth, network migration, mobile ad-hoc networks

[1] A. Oram, "Peer-to-Peer: Harnessing the Benefits of a Disruptive Technology". O'Reilly, 2001.

[2] K. Kant, "An analytic model for peer to peer file sharing networks," in IEEE International Conference on Communications, May 2003, pp. 1801 – 1805 vol. 3.

[3] A. Legout, G. Urvoy-Keller, and P. Michiardi, "Rarest First and Choke Algorithms Are Enough," in IMC'06, Rio de Janeiro, Brazil, 2006.

[4] Yang, X. et al, "Service capacity of peer to peer networks," in 23rd Joint Conference of IEEE Computer and Communications, 2004.

[5] Hu, T. H. et al, "Supporting mobile devices in Gnutella file sharing network with mobile agents," in 8th IEEE Int. Symposium on Computers and Communication, Sep. 2003, pp. 1035 – 1040 vol. 2.

[6] Dhurandher, S. K. et al, "A swarm intelligence-based p2p file sharing protocol using bee algorithm," in IEEE/ACS Int. Conf. on Computer Systems and Applications, 2009.

[7] Asadi, M. et al, "A scalable lookup service for p2p file sharing in manet," in Proc. of the 2007 Int. Conference on Wireless Comm. and Mobile Computing. New York, NY, USA: ACM, 2007.

[8] Huang, C.-M. et al, "A file discovery control scheme for P2P file sharing applications in wireless mobile environments," in Proc. of the 28th Australasian Conference on C.S., 2005.

[9] O. Ratsimor, D. Chakraborty, A. Joshi, T. Finin, and Y. Yesha, "Service discovery in agent-based pervasive computing environments," Mob. Netw. Appl., vol. 9, no. 6, pp. 679–692, 2004.

[10] G. P. Perrucci and F. Fitzek, "Measurements campaign for energy consumption on mobile phones," Aalborg University, Tech. Rep., 2009.



Paper Type : Research Paper
Title : Analytics of Data Science using Big Data
Country : India
Authors : Ch. Sai Krishna Manohar
DOI : 10.9790/0661-01021921

Abstract: This paper covers adoption trends for Big Data, the need for Big Data, and the benefits of Big Data, followed by a summary and conclusion. Our analysis illustrates that Big Data is fast-growing and a key enabler for the social business. The paper also discusses how data is stored as Big Data and the servers that maintain Big Data at the back end. The insights gained from user-generated online content and from collaboration with customers are critical for success in the age of social media.

Keywords – Data warehousing as-a-service (DAAS), InfoSphere BigInsights, Sand Analytic Program.

[1]. http://www-01.ibm.com/software/data/infosphere/biginsights/
[2]. https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&ved=0CD8QFjAC&url=http%3A%2F%2Fgslmug.files.wordpress.com%2F2012%2F02%2Fbig-data-presentation-made-at-the-utah-iseries-user-group-on-feb-82012.ppt&ei=jYI_UaSaJMrtrQe5sYGoCg&usg=AFQjCNE5YIz58c49BghBzJ4N8ZiZsd-hzQ&sig2=v4Qg-
[3]. www.acm.org
[4]. www.ieee.org
[5]. W.J. Book, Modelling design and control of flexible manipulator arms: A tutorial review, Proc. 29th IEEE Conf. on Decision and Control, San Francisco, CA, 1990, 500-506

Paper Type : Research Paper
Title : Efficient Record De-Duplication Identifying Using Febrl Framework
Country : India
Authors : K.Mala, Mr. S.Chinnadurai
DOI : 10.9790/0661-01022227

Abstract: Record linkage is the problem of identifying similar records across different data sources, where the similarity between two records is defined by domain-specific similarity functions over several attributes. De-duplicating one data set or linking several data sets are increasingly important tasks in the data preparation steps of many data mining projects. The aim is to match all records relating to the same entity. Different measures have been used to characterize the quality and complexity of data linkage algorithms, and several new metrics have been proposed; we give an overview of the issues involved in measuring data linkage and de-duplication quality and complexity. A matching tree is used to reduce communication overhead while giving the same matching decisions as the conventional linkage technique. We develop new indexing techniques for scalable record linkage and de-duplication within the Febrl framework, and investigate learning techniques for efficient and accurate indexing.

Keywords - data cleaning; similarity matching; record linkage; data mining pre-processing; Febrl.
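The attribute-wise similarity matching described above can be sketched with Python's difflib; the attribute names and the 0.85 threshold below are illustrative choices, not Febrl's actual configuration:

```python
from difflib import SequenceMatcher

def sim(a: str, b: str) -> float:
    """Domain-agnostic string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec1: dict, rec2: dict, threshold: float = 0.85) -> bool:
    """Average the per-attribute similarities and compare to a threshold."""
    attrs = ["name", "city"]
    score = sum(sim(rec1[a], rec2[a]) for a in attrs) / len(attrs)
    return score >= threshold

r1 = {"name": "Jon Smith", "city": "Chennai"}
r2 = {"name": "John Smith", "city": "Chennai"}
print(is_match(r1, r2))  # the two spellings are close enough to link
```

In practice an indexing (blocking) step, the focus of the paper, would restrict which record pairs are ever compared, since comparing all pairs is quadratic in the data set size.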

[1] R. Baxter, P. Christen, and T. Churches, "A Comparison of Fast Blocking Methods for Record Linkage," Proc. ACM Workshop on Data Cleaning, Record Linkage and Object Consolidation (SIGKDD '03), pp. 25-27, 2003.
[2] M. Bilenko, S. Basu, and M. Sahami, "Adaptive Product Normalization: Using Online Learning for Record Linkage in Comparison Shopping," Proc. IEEE Int'l Conf. Data Mining (ICDM '05), pp. 58-65, 2005.
[3] M. Bilenko and R.J. Mooney, "On Evaluation and Training-Set Construction for Duplicate Detection," Proc. Workshop on Data Cleaning, Record Linkage and Object Consolidation (SIGKDD '03), pp. 7-12, 2003.
[4] M. Bilenko, B. Kamath, and R.J. Mooney, "Adaptive Blocking: Learning to Scale up Record Linkage," Proc. Sixth Int'l Conf. Data Mining (ICDM '06), pp. 87-96, 2006.
[5] D.E. Clark, "Practical Introduction to Record Linkage for Injury Research," Injury Prevention, vol. 10, pp. 186-191, 2004.
[6] T. Churches, P. Christen, K. Lim, and J.X. Zhu, "Preparation of Name and Address Data for Record Linkage Using Hidden Markov Models," BioMed Central Medical Informatics and Decision Making, vol. 2, no. 9, 2002.
[7] P. Christen and K. Goiser, "Quality and Complexity Measures for Data Linkage and Deduplication," Quality Measures in Data Mining, ser. Studies in Computational Intelligence, F. Guillet and H. Hamilton, eds., vol. 43, Springer, pp. 127-151, 2007.
[8] P. Christen, "Febrl: An Open Source Data Cleaning, Deduplication and Record Linkage System With a Graphical User Interface," Proc. 14th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining (KDD '08), pp. 1065-1068, 2008.
[9] P. Christen, "Automatic Record Linkage Using Seeded Nearest Neighbour and Support Vector Machine Classification," Proc. 14th ACM SIGKDD Int'l Conf. Knowledge Discovery and Data Mining (KDD '08), pp. 151-159, 2008.
[10] P. Christen, "A Survey of Indexing Techniques for Scalable Record Linkage and Deduplication," IEEE Transactions on Knowledge and Data Engineering, vol. 24, no. 9, September 2012.


Paper Type : Research Paper
Title : Developing secure software using Aspect oriented programming
Country : India
Authors : Mohammad Khalid Pandit
DOI : 10.9790/0661-01022834

Abstract: Aspect-oriented programming (AOP) is a programming paradigm that explicitly promotes the separation of crosscutting concerns. Some concerns cut across sizable portions of an application, resulting in code scattering and tangling. These problems are particularly severe in security-related applications: the security of an application can be compromised when security-related concerns are scattered through, and tangled with, other concerns. The object-oriented paradigm separates some concerns in an intuitive manner by grouping them into objects; however, it is only good at separating concepts that map easily onto objects, not at separating crosscutting concerns. Aspect-oriented programming is a promising approach to improve the software development process and can tackle this problem by improving the modularity of crosscutting concerns.

Keywords - Programming languages, Aspect-oriented programming, Security, Separation of concerns.
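The around-advice style of AOP can be approximated in Python with decorators, which weave an authorization check around business logic without tangling the two concerns. This is a hedged sketch, not AspectJ; the session state and role names are hypothetical:

```python
import functools

current_user = {"name": "alice", "roles": {"admin"}}  # hypothetical session state

def requires_role(role):
    """Aspect-like advice: weaves an authorization check around any function,
    keeping the security concern out of the business logic it protects."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if role not in current_user["roles"]:
                raise PermissionError(f"{current_user['name']} lacks role {role!r}")
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_role("admin")
def delete_account(account_id):
    # pure business logic: no security code scattered or tangled here
    return f"deleted {account_id}"

print(delete_account(7))  # succeeds while the session holds the admin role
```

Removing a role from current_user["roles"] makes the same call raise PermissionError, with no change to delete_account itself, which is the modularity benefit the abstract describes.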

[1] L. Gong and G. Ellison, Inside Java 2 Platform Security: Architecture, API Design, and Implementation, Pearson Education, 2003.

[2] L. Gong, M. Mueller, H. Prafullchandra, and R. Schemers, "Going Beyond the Sandbox: An Overview of the New Security Architecture in the Java Development Kit 1.2," Proc. USENIX Symp. Secure Systems, 1997.

[3] J. Viega, J. Bloch, and P. Chandra, "Applying Aspect-Oriented Programming to Security," Cutter IT Journal, vol. 14, no. 2, pp. 31-39, Feb. 2001.

[4] B. de Win, B. Vanhaute, and B. de Decker, "Security through Aspect-Oriented Programming," Proc. Advances in Network and Distributed Systems Security, pp. 125-138, 2001.

[5] John Viega, J.T.Bloch, and Pravir Chandra "Applying Aspect oriented programming to security".

[6] Bart De Win, Joosen, and Frank Piessens "Developing Secure Applications through Aspect oriented programming".

[7] Bart De Win, Bart Vanhaute, and Bart De Decker "How Aspect oriented programming can help to building secure software?"

[8] Li Gong, Marianne Mueller, Hemma Prafullchandra, and Roland Schemers "Going beyond the Sandbox: An Overview of the New Security Architecture in JavaTM Development Kit 1.2".

[9] Rodolfo Toledo, Ángel Núñez, and Éric Tanter, "Aspectizing Java Access Control".


Paper Type : Research Paper
Title : Analysis and Classification of Skin Lesions Using 3D Volume Reconstruction
Country : India
Authors : P.Sundari, P.Gokila
DOI : 10.9790/0661-01023540

Abstract: 3D volume reconstruction is used to identify skin cancer at an early stage, which can prevent deaths. Subsurface information such as blood layer thickness and blood volume components can be identified from multispectral transillumination images of skin lesions. Simulated lesions are reconstructed with excellent volume accuracy, and preliminary validation is also performed on a small set of clinical lesions whose severity was categorized by an expert dermatologist. Using two features, the average blood layer thickness and the ratio of blood volume to total lesion volume, lesions can be classified into mild, moderate, and severe classes with high accuracy. However, existing methods do not perform efficiently over a large set of images. An inverse volume reconstruction method is presented that uses a genetic algorithm optimization procedure, with a novel population initialization routine and a nudge operator, to reconstruct the melanin and blood layer volume components from the multispectral images. The proposed system is expected to differentiate classes of lesion severity based on multispectral transillumination and to perform over a large set of images, enabling fast screening, tracking, and detection of early skin cancers such as melanoma.

Keywords: Multispectral imaging, Volume estimation, Genetic algorithm, Volume reconstruction.
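The inverse-reconstruction idea, searching for layer parameters whose simulated appearance best matches the observed images, can be sketched with a toy genetic algorithm. The fitness function, the two-layer target, and the "nudge" mutation below are simplified stand-ins for the paper's actual forward light-transport simulation and operators:

```python
import random

random.seed(1)  # deterministic toy run

def fitness(layers, target):
    """Negative reconstruction error; stands in for comparing simulated
    against measured multispectral images."""
    return -sum((a - b) ** 2 for a, b in zip(layers, target))

def nudge(layers, step=0.05):
    """Nudge-style mutation: small perturbation of one layer value."""
    i = random.randrange(len(layers))
    out = list(layers)
    out[i] += random.uniform(-step, step)
    return out

def ga(target, pop_size=30, generations=200):
    # random population initialization over plausible thickness values
    pop = [[random.random() for _ in target] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(ind, target), reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [nudge(random.choice(survivors)) for _ in survivors]
    return max(pop, key=lambda ind: fitness(ind, target))

target = [0.30, 0.12]  # hypothetical blood / melanin layer thicknesses
best = ga(target)
print([round(x, 2) for x in best])
```

Selection plus the small-step mutation drives the population toward the target parameters; the real method replaces the squared-error fitness with agreement between simulated and captured transillumination images.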

[1] B. D'Alessandro and A. P. Dhawan (2010), "Depth-dependent hemoglobin analysis from multispectral transillumination images," IEEE Transactions on Biomedical Engineering, vol. 57, no. 10, pp. 2568–2571.
[2] B. D'Alessandro and A. P. Dhawan (2011), "Voxel-based, parallel simulation of light in skin tissue for the reconstruction of subsurface skin lesion volumes," in Proceedings of the 33rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 8448–8451.
[3] B. D'Alessandro and A. P. Dhawan (Sep 2012), "3D Volume Reconstruction of Skin Lesions for Melanin and Blood Volume Estimation and Lesion Severity Analysis".
[4] Claridge.E, S. Cotton, P. Hall, and M. Moncrieff,(2003) "From colour to tissue histology: Physics-based interpretation of images of pigmented skin lesions," Medical Image Analysis, vol. 7, no. 4, pp. 489–502.
[5] Dhawan.A.P, B. D'Alessandro, and X. Fu, ( 2010 )"Optical imaging modalities for biomedical applications," IEEE Reviews in Biomedical Engineering, vol. 3, pp. 69–92.
[6] Friedman.R.J, D. S. Rigel, and A. W. Kopf(Aug 1999), "Early detection of malignant melanoma: The role of physician examination and self-examination of the skin," CA: A Cancer Journal for Clinicians, vol. 35, no. 3, pp. 130–151.
[7] Gambichler.G, P. Regeniter, F. Bechara, A. Orlikov, R. Vasa, G. Moussa, M. Stücker, P. Altmeyer, and K. Hoffmann,(2007) "Characterization of benign and malignant melanocytic skin lesions using optical coherence tomography in vivo," Journal of the American Academy of Dermatology, vol. 57, no. 4, pp. 629–637.
[8] Goodson. and D. Grossman,(2009) "Strategies for early melanoma detection: Approaches to the patient with nevi," Journal of the American Academy of Dermatology, vol. 60, no. 5, pp. 719–735.
[9] Kittler.H, H. Pehamberger, K. Wolff, and M. Binder, (2002) "Diagnostic accuracy of dermoscopy" The lancet oncology, vol. 3, no. 3, pp. 159–165.

Paper Type : Research Paper
Title : Searching and Analyzing Qualitative Data on Personal Computer
Country : India
Authors : Mohit Bhansali, Praveen Kumar
DOI : 10.9790/0661-01024145

Abstract: The number of files stored on our personal computers (PCs) is increasing rapidly, and locating information in this environment is difficult. In this paper we present a set of building blocks for implementing a desktop search system composed of four modules: an indexing mechanism, text analysis, index storage, and a searching mechanism. The implementation of these four modules is described in detail; additionally, we describe the implementation of the user interface and how the modules interact with each other.

Keywords – Desktop Search, Information Retrieval, indexing, searching, personal computer (PC).
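The indexing, analysis, and searching modules can be sketched end to end with a tiny in-memory inverted index; the class, file names, and AND-query semantics are illustrative choices:

```python
import re
from collections import defaultdict

def tokenize(text):
    """Analysis module: lowercase and split into alphanumeric terms."""
    return re.findall(r"[a-z0-9]+", text.lower())

class DesktopIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # index storage: term -> file ids

    def add(self, file_id, text):
        """Indexing module: record which files contain each term."""
        for term in tokenize(text):
            self.postings[term].add(file_id)

    def search(self, query):
        """Searching module: files containing ALL query terms."""
        terms = tokenize(query)
        if not terms:
            return set()
        result = self.postings[terms[0]].copy()
        for t in terms[1:]:
            result &= self.postings[t]
        return result

idx = DesktopIndex()
idx.add("notes.txt", "cloud computing security notes")
idx.add("todo.txt", "buy milk; review security paper")
print(idx.search("security notes"))  # only notes.txt matches both terms
```

A production desktop search engine persists the postings to disk and ranks results, but the four-module decomposition is the same.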

[1] Karger, David R., et al. "Haystack: A customizable general-purpose information management tool for end users of semistructured data." Proc. of the Conference of Innovative Data Systems Research, 2005.
[2] Sinha, Vineet, and David R. Karger. "Magnet: Supporting navigation in semistructured data environments." Proceedings of the 2005 ACM SIGMOD international conference on Management of data. ACM, 2005.
[3] Sauermann, Leo, and Sven Schwarz. "Gnowsis adapter framework: Treating structured data sources as virtual rdf graphs." The Semantic Web–ISWC 2005(2005): 1016-1028.
[4] Aleman-Meza, Boanerges, et al. "Context-aware semantic association ranking." Proceedings of Semantic Web and Database Workshop. Vol. 3. 2003.
[5] Otis Gospodnetic,Erik Hatcher. Lucene in Action. Manning Publications, 2006.
[6] Li, Shengdong, et al. "Study on efficiency of full-text retrieval based on lucene."Information Engineering and Computer Science, 2009. ICIECS 2009. International Conference on. IEEE 2009.
[7] Tian, Wen, Zhou Ya, and Huang Guimin. "Research and implementation of a desktop full-text search system based on Hyper Estraier." Intelligent Computing and Integrated Systems (ICISS), 2010 International Conference on. IEEE, 2010.
[8] Lucene: http://lucene.apache.org/.
[9] Gospodnetic, Otis. "Parsing, indexing, and searching XML with Digester and Lucene." Journal of IBM Developer Works (2003).
[10] Wei Zhao, "The design and research of Literary retrieval system based on Lucene," Electronic and Mechanical Engineering and Information Technology (EMEIT), 2011 International Conference on , vol.8, no., pp.4146,4148, 12-14 Aug. 2011.
[11] Hristidis, Vagelis, Heasoo Hwang, and Yannis Papakonstantinou. "Authority-based keyword search in databases." ACM Transactions on Database Systems (TODS) 33.1 (2008): 1.


Paper Type : Research Paper
Title : SQL Injection Protector for Authentication in Distributed Applications
Country : India
Authors : Mrs. R. Velvizhi
DOI : 10.9790/0661-01024649

Abstract: In today's information age, information sharing and transfer have increased exponentially. The threat of an intruder accessing secret information has been an ever-present concern for data communication experts, and cryptography and steganography are the most widely used techniques to overcome it. Web applications have developed with very rapid progress, and security vulnerabilities exploited by intruders and hackers have become predominant in current trends. This paper proposes a technique that uses a hash value of the user name and password to improve the authentication process. We have built a prototype, SQL Injection Protector for Authentication (SQLIPA). In addition to the proposed hash technique, we conceal the logged-in information using a chaffing-and-winnowing technique in a steganographic image. These images are stored as file streams behind an encrypted layer in the back end to hide the tuples used for storage in a distributed environment. Validating the XML content with a typed dataset scrutinizes the input data further, in association with XSD filtration.

Key Words: Security, Cryptography, Steganography, SQL Injection Protector for Authentication, Hash Technique.
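The hash-based authentication step can be sketched as follows; SQLIPA's exact scheme is not specified in the abstract, so the per-user salt and SHA-256 choice are assumptions. Because only a fixed-format hex digest is ever compared or stored, injection payloads typed into the password field never reach a SQL string:

```python
import hashlib
import hmac
import os

def make_record(username: str, password: str) -> dict:
    """At registration: store a salted hash of username+password, never the password."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + username.encode() + password.encode()).hexdigest()
    return {"salt": salt, "hash": digest}

def authenticate(username: str, password: str, record: dict) -> bool:
    """At login: recompute the digest and compare in constant time."""
    candidate = hashlib.sha256(
        record["salt"] + username.encode() + password.encode()
    ).hexdigest()
    return hmac.compare_digest(candidate, record["hash"])

rec = make_record("alice", "s3cret")
print(authenticate("alice", "s3cret", rec))         # valid credentials pass
print(authenticate("alice", "' OR '1'='1", rec))    # injection text is just a wrong password
```

For real deployments a slow key-derivation function (e.g. PBKDF2 via hashlib.pbkdf2_hmac) should replace the single SHA-256 round; the sketch keeps one round for brevity.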

[1] Piyush Marwaha and Paresh Marwaha, "Visual Cryptographic Steganography in Images," 2010 Second International Conference on Computing, Communication and Networking Technologies, Infosys Technologies Limited.

[2] Taekyoung Kwon, Member, IEEE, and Hyeonjoon Moon, Member, IEEE, "Biometric Authentication for Border Control Applications," IEEE Transactions on Knowledge and Data Engineering, vol. 20, no. 8, August 2008.

[3] Yihun Alemu, Jong-bin Koh, Muhammed Ikram,Dong-Kyoo Kim, "Image Retrieval in Multimedia Databases: A Survey",Department of Computer Engineering, Ajou University South Korea, Suwon.

[4] Diallo Abdoulaye Kindy and Al-Sakib Khan Pathan ," A Survey On SQL Injection: Vulnerabilities, Attacks, and Prevention Techniques", Department of Computer Science, International Islamic University Malaysia, Malaysia 2011 IEEE 15th International Symposium on Consumer Electronics

[5] Vijay Varadharajan, Udaya Tupakula,"Security Techniques for Zero Day Attacks" Information & Networked Systems Security Research Faculty of Science, Macquarie University, Sydney, Australia

[6] Debashish Jena, "A Novel Visual Cryptography Scheme", IEEE International Conference on Advanced Computer Control, 2009.

[7] K. Kemalis and T. Tzouramanis (2008). SQL-IDS: A Specification-based Approach for SQL Injection Detection. SAC'08, Fortaleza, Ceará, Brazil, ACM: pp. 2153-2158.


Paper Type : Research Paper
Title : Section Code Task Model for Heterogeneous Processor In Real Time System
Country : India
Authors : Jestin Rajamony, S. Gladson Oliver
DOI : 10.9790/0661-01025059

Abstract: Although many schedulers are available, almost none delivers maximum performance when used in a real-time system under overload. To overcome this, this paper uses an innovative approach that achieves higher utility than comparable schedulers. The program is split and grouped into clusters in the initial stage, a section code is tested for each cluster of code, and the miss ratio is identified using a feedback loop. The main contribution of the paper is selecting the desired algorithm and assigning it to the appropriate core of the heterogeneous multicore processor: if a deadline is missed on any one core, the scheduler submits the task to the next core, of higher or lower speed, using the feedback control framework. Double-checking each section of code before it is used in the real-time system widens the operating range, avoiding deadline misses and obtaining maximum utility. The design applies operating-system scheduling algorithms on top of a control-system methodology.

Key-Words: Section code, loop cloud, MAPS.
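A minimal, non-preemptive Earliest-Deadline-First sketch shows how the deadline misses that feed the paper's miss-ratio loop arise under load; the job set is hypothetical and the core-reassignment step is not modeled:

```python
import heapq

def edf_schedule(jobs, now=0):
    """Earliest-Deadline-First: always run the ready job with the nearest deadline.
    jobs: list of (release, deadline, runtime) tuples (hypothetical workload).
    Returns (release, deadline, finish_time, deadline_met) per job."""
    pending = sorted(jobs)              # by release time
    ready, done, t, i = [], [], now, 0
    while i < len(pending) or ready:
        # admit every job released by the current time
        while i < len(pending) and pending[i][0] <= t:
            r, d, c = pending[i]
            heapq.heappush(ready, (d, c, r))  # heap ordered by deadline
            i += 1
        if not ready:                   # idle until the next release
            t = pending[i][0]
            continue
        d, c, r = heapq.heappop(ready)
        t += c                          # run to completion (non-preemptive sketch)
        done.append((r, d, t, t <= d))  # record whether the deadline was met
    return done

jobs = [(0, 10, 4), (1, 5, 2), (2, 20, 3)]
for release, deadline, finish, met in edf_schedule(jobs):
    print(release, deadline, finish, met)
```

In this run the tight-deadline job released at t=1 misses its deadline because the first job cannot be preempted; a feedback scheduler like the paper's would observe that miss and reassign or re-prioritize work on another core.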

[1] R. Kumar et al., "Single-ISA heterogeneous multicore architectures for multithreaded workload performance", ISCA, p. 64, 2004.
[2] J. R. Haritsa, M. Livny and M. J. Carey, "Earliest Deadline Scheduling for Real-Time Database Systems", IEEE RTSS, 1991.
[3] W. Zhao, K. Ramamritham and J.A. Stankovic, "Pre-emptive Scheduling Under Time and Resource Constraints", IEEE Transactions on Computers, no. 36(8), 1987.
[4] C.L. Liu and J.W. Layland, "Scheduling Algorithms for Multiprogramming in a Hard Real-Time Environment", Journal of the ACM, vol. 20, no. 1, pp. 46-61, 1973.
[5] Paweł Piątek, Wojciech Grega, "Speed Analysis of a Digital Controller in the Critical Application", Journal of Automation, Mobile Robotics & Intelligent Systems, vol. 3, 2009.
[6] A. Georges et al., "Method-level phase behavior in Java workloads", OOPSLA, 2004.
[7] P. Marti and M. Velasco, "Toward flexible scheduling of real-time control tasks: Reviewing basic control models", Proc. Hybrid Systems: Computation and Control, LNCS.
[8] E. Johannesson, T. Henningsson and A. Cervin, "Sporadic control of first-order linear stochastic systems," ICHS, Computation and Control, 2007.
[9] M. Rabi, "Packet based inference and control," PhD thesis, Institute for Systems Research, University of Maryland, 2006.
[10] Martin Ohlin, Dan Henriksson, Anton Cervin, TrueTime, Department of Automatic Control, Lund University, 2007.


Paper Type : Research Paper
Title : Optimization of Mining Association Rule from XML Documents
Country : India
Authors : P.Jothi lakshmi, D.Sasikala
DOI : 10.9790/0661-01026064

Abstract: Association rule mining finds interesting correlations among a large set of data items. With large amounts of data being collected and stored continuously in databases, it has become essential to mine interesting relationships between attributes. Semi-structured data refers to data with some implicit structure but not enough of a regular one; mining association rules from it faces additional challenges due to its inherent flexibility in both structure and semantics. The eXtensible Markup Language (XML) is a major standard for storing and exchanging information. An index-based scheme indexes all the elements in a group of XML documents, and the resulting index table is used to check the ancestor-descendant relation between an item and a transaction efficiently. The Apriori algorithm is then used to mine association rules from the XML documents with no guidance from the user. Building on association rule mining and the Apriori algorithm, this paper optimizes the result generated by Apriori using the Ant Colony Optimization (ACO) algorithm, choosing the confidence value as the pheromone update value. ACO is a meta-heuristic inspired by the foraging behaviour of ant colonies and was introduced by Dorigo.

Keywords - Ancestor-descendant relation, Ant Colony Optimization (ACO), Association rule mining, Index Table, XML.
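A compact Apriori sketch over toy transactions, with rule confidence computed the way the paper reuses it as the ACO pheromone update value; the transactions and 0.5 support threshold are illustrative, and the XML indexing layer is omitted:

```python
def support(itemset, transactions):
    """Fraction of transactions containing the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def apriori(transactions, min_support=0.5):
    """Return every frequent itemset with its support."""
    singles = {frozenset([i]) for t in transactions for i in t}
    level = {s for s in singles if support(s, transactions) >= min_support}
    freq = {}
    while level:
        for s in level:
            freq[s] = support(s, transactions)
        # join step: combine frequent k-itemsets into candidate (k+1)-itemsets
        cands = {a | b for a in level for b in level if len(a | b) == len(a) + 1}
        level = {c for c in cands if support(c, transactions) >= min_support}
    return freq

tx = [frozenset(t) for t in (["a", "b"], ["a", "c"], ["a", "b", "c"], ["b", "c"])]
freq = apriori(tx)

# confidence of the rule {a} -> {b}: support(ab) / support(a);
# the paper feeds this value into the ACO pheromone update
confidence = freq[frozenset("ab")] / freq[frozenset("a")]
print(round(confidence, 2))
```

The anti-monotone pruning (a candidate is only kept if its support clears the threshold) is what keeps the candidate space tractable before the ACO refinement is applied.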

[1] Al-Ani Ahmed, "Feature Subset Selection Using Ant Colony Optimization", International Journal of Computational Intelligence (IJCI), vol. 2, No. 1, pp. 53-58.
[2] Nada M. A. AL-salami, Saad Ghaleb Yaseen, "Ant Colony Optimization", International Journal of Computer Science and Network Security (IJCSNS), vol.8 No.6, pp 351-357, June,( 2008).
[3] Yiwu Xie, Yutong Li, Chunli Wang, MingyuLu, "The Optimization and Improvement of the Apriori Algorithm", International Symposium on Intelligent Information Technology Application Workshops, IEEE, (2008).
[4] Yan-hua Wang, Xia Feng, "The Optimization of Apriori Algorithm Based on Directed Network", Third International Symposium on Intelligent Information Technology Application, IEEE, (2009).
Books:
[5] M Dorigo, T Stutzle, Ant Colony Optimization(The MIT press Cambridge, MA).
Proceedings Papers:
[6] Rakesh Agrawal and Ramakrishnan Srikant, Fast Algorithms for Mining Association Rules, Proceedings of the 20th VLDB Conference, Santiago, (1994).
[7] Marco Dorigo, Christian Blum,An Efficient Algorithm for Mining Association Rules in Large Databases, Proceedings of the 21st VLDB Conference, Switzerland, (1995).
[8] Charu C. Aggarwal, Philip S. Yu, Online Generation of Association Rules, Proceedings of 14th International Conference, IEEE, February,(1998).
[9] Gao, Shao-jun Li,A method of Improvement and Optimization on Association Rule Apriori algorithm, Proceedings of the 6th conference on intelligent control and automation, (2006).


Paper Type : Research Paper
Title : Quality Assurance Standards and Survey of IT Industries
Country : Pakistan
Authors : Maria Khalid, Isma Yaqoob, Farhana Shahid, Rohma Nayab, Mehreen Sirshar
DOI : 10.9790/0661-01026574

Abstract: The quality of a product depends on customer satisfaction, which can be achieved by applying standards. Different standards reported in the literature assist in improving the quality of products; the intent of each standard is to assure quality in the processes and achieve a standard product. This research is a comprehensive survey of the standards followed by IT industries for assuring quality in their products. Our analysis shows that systems improve after these standards are adopted. IT businesses follow the quality standards of ISO (International Organization for Standardization), CMMI (Capability Maturity Model Integration), PMI (Project Management Institute), ASME (American Society of Mechanical Engineers), ANSI (American National Standards Institute), IEC (International Electrotechnical Commission), DRR (Digitally Reconstructed Radiography), and ASQ (American Society for Quality), and some of them also use quality evaluation tools for quality assurance. This research also assesses the improvements of IT businesses after applying quality standards.

Keywords: ANSI (American National Standards Institute), ASME (American Society of Mechanical Engineers), ASQ (American Society for Quality), CMMI (Capability Maturity Model Integration), DRR (Digitally Reconstructed Radiography), ISO (International Organization for Standardization).

[1]. Meyer,B., Object-Oriented Software Construction, Prentice Hall PTR, 2000, pp.4-20
[2]. McCall,J.A., Richards,P. K., & Walters, G. F., Factors in Software Quality, Vols.1, 2 and 3, National Technical Information Service, 1977
[3]. Pfleeger,S. L., Software Engineering: Theory and Practice, Prentice Hall, 2005
[4]. F.J. Domínguez-Mayo, M.J.Escalona, M.Mejías, M.Ross and G.Staples. "Quality Evaluation for Model-Driven Web Engineering Methodologies", Information and Software Technology, Vol. 54, Nov 2012. pp.1265-1282.
[5]. M.Sirshar. "Evaluation of Quality Assurance Factors in Agile Methodologies", International Journal of Advanced Computer Science, Vol. 2, Feb2012, pp. 73-78.
[6]. A.Javed, M.Maqsood, K.A.Qazi, K.A.Shah. "How To Improve Software Quality Assurance in Developing Countries", An International Journal (ACIJ), Vol. 3, Mar 2012.
[7]. J.A.Sani, N.Othman. "Quality Standard and Specification for Soft-Scape Construction in Malaysia". Malaysia, 2012.
[8]. Y.T.Sung, K.E.Chang and W.C.Yu. "Evaluating the Reliability and Impact of a Quality Assurance System for E-learning Courseware", Computers & Education, Vol. 57, Sept 2011, pp. 1615-1627.
[9]. A.Iftikhar, S.M.Ali. "Software Quality Assurance: a Study Based on Pakistan's Software Industry", Vol. 1, 2011, pp. 65-73.
[10]. K.K.F.Yuen , H.C.W. Lau. "A Fuzzy Group Analytical Hierarchy Process Approach for Software Quality Assurance Management: Fuzzy logarithmic Least Squares Method". Expert Systems with Applications, vol. 38, Jan.2011, pp. 10292-10302.


Paper Type : Research Paper
Title : Developing Web Browser-Jan
Country : India
Authors : Rajeswari.M, Brindha.S, Sindhuja.N, Jaganathan.S
DOI : 10.9790/0661-01027578

Abstract: A web browser is a software application for retrieving, presenting and traversing information resources; it can also be defined as application software designed to enable users to access, retrieve and view documents and other resources on the Internet. The major web browsers are Chrome, Firefox, Internet Explorer, Opera, and Safari. The JAN web browser is aimed at users who need efficient resource use and want to save time. It is based on the concept of data mining, where data is extracted from the huge resource called the Internet. JAN includes additional features: a virtual keyboard, used for security so that passwords cannot be hacked; support for HTTP Secure; and quick, easy ways to delete the web cache, cookies, and browsing history. It also shows the current time and date, which the user can use to set an alarm for their work. The browser further provides a speed dialing concept in which speed dials are assigned network-wise, which is not found in existing browsers, and it lets the user save documents in their required format. The project thus aims at the satisfaction of users' expectations.

Keywords – Browser, Speed dialing, Security, Virtual keyboard
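The virtual-keyboard idea described in the abstract can be illustrated with a short sketch (a hypothetical illustration, not the JAN implementation): the user enters a password by clicking keys on an on-screen grid, and because the layout is reshuffled for every session, a keylogger that records keystrokes or click coordinates learns nothing about the password.

```python
import random

def shuffled_layout(keys="abcdefghijklmnopqrstuvwxyz0123456789", rows=6):
    """Return a randomly shuffled grid of keys for one session.

    Shuffling per session means a recorded click position maps to a
    different character next time, defeating position-based logging.
    """
    pool = list(keys)
    random.shuffle(pool)
    row_len = (len(pool) + rows - 1) // rows  # ceil division
    return [pool[i:i + row_len] for i in range(0, len(pool), row_len)]

def key_at(layout, row, col):
    """Map a click position back to the character for this session."""
    return layout[row][col]

# Usage: build a fresh layout, then resolve clicks against it.
layout = shuffled_layout()
first_click = key_at(layout, 0, 0)
```

Every character appears exactly once in the grid, so any password can still be entered; only the positions change between sessions.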

[1] Kouji Nakao and Kenji Suzuki, "Proposal on a Secure Communication Service Element (SCSE) in the OSI Application Layer", IEEE Journal on Selected Areas in Communications, Vol. 7, No. 4, May 1989.
[2] Samit Bhattacharya, Debasis Samanta and Anupam Basu, "Performance Models for Automatic Evaluation of Virtual Scanning Keyboards", IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 16, October 2008.
[3] Nachiketh R. Potlapally, Anand Raghunathan, "A Study of the Energy Consumption Characteristics of Cryptographic Algorithms and Security Protocols", IEEE Transactions on Mobile Computing, Vol. 5, No. 2, February 2006.


Paper Type : Research Paper
Title : College Collaboration Portal with Training and Placement
Country : India
Authors : Shilpa Hadkar, Snehal Baing, Trupti Harer, Sonam Wankhade, K.T.V. Reddy
: 10.9790/0661-01027981

Abstract: Training and placement is a crucial part of any educational institute, in which most of the work is currently done manually. The aim of our project is automation of the Training and Placement unit with minimum manual work and maximum optimization, abstraction, and security. This is a web application that will help students as well as the administration authority to carry out each activity in this department.

Keywords - Automation, Optimization, Security, Task, Activity, Interface, Training information, Software and Placement.

[1] R. Ostrovsky, "Software protection and simulations on oblivious RAMs", Ph.D. dissertation, Massachusetts Institute of Technology, 1992.
[2] V. Levenshtein, "Binary codes capable of correcting spurious insertions and deletions of ones", Problems of Information Transmission, Vol. 1, No. 1, pp. 8–17, 1965.
[3] Tech Terms: What Every Telecommunications and Digital Media Person Should Know, by Jeff Rutenbeck, Jeffrey Blaine.
[4] Microsoft Silverlight 4 Business Application Development: Beginners Guide By Cameron Albert, Frank LaVigne, Packt Publishing
[5] Beginning ASP.NET 3.5: In C# and VB (Programmer to Programmer), by Imar Spaanjaars, Wrox.
[6] SQL Programming Joes 2 Pros: Programming & Development for Microsoft SQL Server 2008, by Rick A. Morelan.
[7] Kimball's Toolkit Classics: Data Warehouse Toolkit, 2nd Ed.; Data Warehouse Lifecycle Toolkit, 2nd Ed.; Data Warehouse ETL Toolkit.
[8] The Microsoft Data Warehouse Toolkit, 2nd Edition: With SQL Server 2008 R2 and the Microsoft Business Intelligence Toolset


Paper Type : Research Paper
Title : A Survey on Identification of Closed Frequent Item Sets Using Intersecting Algorithm for Transaction in Data Mining
Country : India
Authors : Veenita Gupta, Neeraj Kumar, Praveen Kumar
: 10.9790/0661-01028286

Abstract: Most known frequent item set mining approaches enumerate candidate item sets, determine their support, and prune candidates that fail to reach the user-specified minimum support. Apart from this scheme, an intersection approach can be used to identify frequent item sets. However, intersecting transactions is a less researched area and needs attention and improvement before it can be applied widely. To the best of our knowledge, there are only two basic algorithms: a cumulative scheme, which is based on a repository with which new transactions are intersected, and the Carpenter algorithm, which enumerates and intersects candidate transaction sets. These approaches yield the set of so-called closed frequent item sets, since any such item set can be represented as the intersection of some subset of the given transactions. As the transactional database grows, the size of the prefix tree also grows, which makes it difficult to handle. An improvement has been suggested to reduce the total number of branches in the prefix tree, leading to a reduction in its size.

Keywords : algorithm, closed item set, frequent item set mining, intersection, transaction.
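The cumulative scheme mentioned in the abstract can be sketched as follows (an illustrative reconstruction under our own naming, not the surveyed implementations): each incoming transaction is intersected with a repository of previously collected item sets, and supports are merged by taking the maximum; since every closed item set is the intersection of some subset of the transactions, the repository ends up holding the closed sets with their supports.

```python
def closed_frequent_itemsets(transactions, min_support):
    """Cumulative intersection scheme: intersect each new transaction
    with a repository of earlier intersections, tracking supports."""
    repo = {}  # frozenset of items -> support count so far
    for t in transactions:
        t = frozenset(t)
        # The transaction itself is supported at least once.
        updates = {t: 1}
        for itemset, count in repo.items():
            inter = itemset & t
            if inter:
                # Every transaction containing `itemset` also contains
                # the intersection, plus the new transaction `t` itself.
                updates[inter] = max(updates.get(inter, 0), count + 1)
        for itemset, count in updates.items():
            repo[itemset] = max(repo.get(itemset, 0), count)
    return {s: c for s, c in repo.items() if c >= min_support}

# Usage: three transactions; {b} occurs in all of them,
# {a,b} and {b,c} in two each.
result = closed_frequent_itemsets(
    [["a", "b"], ["b", "c"], ["a", "b", "c"]], min_support=2)
```

On this toy input the repository retains exactly the closed frequent item sets {b}, {a, b}, and {b, c}; the infrequent {a, b, c} is filtered out by the support threshold.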

[1] T. Uno, M. Kiyomi, and H. Arimura. Lcm ver. 2: Efficient mining algorithms for frequent/closed/maximal itemsets. In Proc. Workshop Frequent Item Set Mining Implementations (FIMI 2004, Brighton, UK), Aachen, Germany, 2004. CEUR Workshop Proceedings 126.

[2] G. Grahne and J. Zhu. Reducing the main memoryconsumptions of FPmax* and FPclose. In Proc. Workshop Frequent Item Set Mining Implementations (FIMI 2004, Brighton, UK), Aachen, Germany, 2004. CEUR Workshop Proceedings 126.

[3] Calders T, Garboni C, Goethals B (2010) Efficient pattern mining of uncertain data with sampling. In: Proceedings of the 14th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD 2010, Hyderabad, India), Vol. I. Springer, Berlin, pp 480–487.

[4] B. Goethals and M. Zaki, editors. Proc. Workshop Frequent Item Set Mining Implementations (FIMI 2004, Brighton, UK), Aachen, Germany, 2004. CEUR Workshop Proceedings 126.

[5] C. Borgelt and X. Wang. SaM: A split and merge algorithm for fuzzy frequent item set mining. In Proc. 13th Int. Fuzzy Systems Association World Congress and 6th Conf. of the European Society for Fuzzy Logic and Technology (IFSA/EUSFLAT'09, Lisbon, Portugal), Lisbon, Portugal, 2009. IFSA/EUSFLAT Organization Committee.

[6] F. Pan, G. Cong, A. Tung, J. Yang, and M. Zaki. Carpenter: Finding closed patterns in long biological datasets. In Proc. 9th ACM SIGKDD Int. Conf. on Knowledge Discovery and Data Mining (KDD 2003, Washington, DC), pages 637–642, New York, NY, USA, 2003. ACM Press.

[7] F. Pan, A. Tung, G. Cong, and X. Xu. Cobbler: Combining column and row enumeration for closed pattern discovery. In Proc. 16th Int. Conf. on Scientific and Statistical Database Management (SSDBM 2004, Santorini Island, Greece), page 21, Piscataway, NJ, USA, 2004. IEEE Press.

[8] G. Cong, K.-L. Tan, A. Tung, and F. Pan. Mining frequent closed patterns in microarray data. In Proc. 4th IEEE International Conference on Data Mining (ICDM 2004, Brighton, UK), pages 363–366, Piscataway, NJ, USA, 2004. IEEE Press.


Paper Type : Research Paper
Title : Enhancing Software Quality Using Agile Techniques
Country : Bangladesh
Authors : Amran Hossain, Dr. Md. Abul Kashem, Sahelee Sultana
: 10.9790/0661-01028793

Abstract: Agile techniques may produce software faster as well as enhance software quality so that it fulfills the quality requirements of the product. In this paper we have considered some quality factors and shown how agile techniques enhance software quality. We have presented an agile development life cycle showing its software quality support processes. Finally, we have summarized the software quality evaluation with agile techniques that enhance software quality.

Keywords- Agile methods, Architectural spike, Software Quality, Software Quality assurance, System metaphor.

[1] Ming Huo, June Verner, Liming Zhu, Muhammad Ali Babar, Software Quality and Agile Methods, Proceedings of the 28th Annual International Computer Software and Applications Conference (COMPSAC'04)
[2] What Is Software Quality? , http://www.ocoudert.com/blog/2011/04/09/what-is-software-quality
[3] G. Gordon Schulmeyer, The Handbook of Software Quality Assurance, Prentice Hall, 1998.
[4] Daniel Galin, Software Quality Assurance, first published 2004.
[5] Osama Suhaib and Khalid Khan, The Role of Software Quality in Agile Software Development Methodologies, Technology Forces (Technol. Forces): Journal of Engineering and Sciences, January-June 2010.
[6] Pete McBreen, Quality Assurance and Testing in Agile Projects, 2003. http://www.mcbreen.ab.ca/talks/CAMUG.pdf (last accessed January 2013)
[7] K. Beck, Extreme Programming Explained: Embrace Change, Second Edition, 2000.
[8] Explore Extreme Programming: The system Metaphor. http://xp123.com/articles/the-system-metaphor/ (Last accessed 26th January 2013)
[9] System metaphor http://c2.com/xp/SystemMetaphor.html (Last accessed 26th January 2013)
[10] Scott W. Ambler, The Architecture Owner Role: How Architects Fit in on Agile Teams, http://www.agilemodeling.com/essays/architectureOwner.htm



IOSR Journal is published in both online and print versions.