Two Weeks Faculty Development Program on “Emerging Trends in Computer Science & IT”
16 May 2016 - 27 May 2016
With the intent of providing the necessary concepts, knowledge and skills on research and emerging trends in computer science, and of helping fulfil the needs of society, a two-week faculty development program was organized by Bharati Vidyapeeth's Institute of Computer Applications and Management (BVICAM) from 16 May 2016 to 27 May 2016.

Ms. Ashu Khurana, Assistant Professor, Department of Computer Science, along with Ms. Meenu Gupta, Assistant Professor, Department of CS/IT, attended the program. The workshop was attended by over 34 faculty members from various institutes. A day-wise brief follows.
Day 1: 16 May 2016

The session began with a welcome address by Prof. M. N. Hoda, Director, Bharati Vidyapeeth's Institute of Computer Applications and Management (BVICAM), followed by his talk on the objectives of the faculty development program. He explained that the need of the hour is to change the way we think about research, and how that would help coming generations. Thereafter, Dr. Naseeb Singh Gill, Head, Department of Computer Science & Applications, MDU (Rohtak), discussed three fields essential for enhancing the quality of research work: People, Process and Technology. He said that if we are innovative and creative, we can produce new, state-of-the-art research. In the end he cited the famous quote of the late Dr. A. P. J. Abdul Kalam:
“Manzil unhi ko milti hai, jinke sapno mein jaan hoti hai.
Pankh se kuch nahi hota, hauslon se udaan hoti hai.”
(Only those whose dreams have life reach the destination; it is courage, not wings, that makes one soar.)
The next session was conducted by Mr. Dheeraj Prakash, Lead Developer at Aricent Technologies, on User Centric Design. He said that design means building something around a user, and explained the Why, What, Whom and How of design with live examples. He highlighted key points to remember while designing a website: drive the eye to the focal point and define the hierarchy. When designing a website, designers first discover the requirements and then design accordingly.
The evening session was conducted by Ms. Suvira Srivastav, Associate Director, Springer, on quality publication. She explained the steps to writing a quality paper: use simple English, pair it with a strong literature survey, write the research methodology first, then the introduction and discussion, and decide the title of the paper at the end. She also explained common reasons for rejecting a paper, such as the research containing nothing new or being out of scope. She then discussed whether it is better to publish a paper in a conference or in a journal: if quick publication and review are needed, a conference is preferable; otherwise a journal. The advantage of a journal over a conference is that it gives detailed reasons when rejecting a paper, so authors can improve their research work. At the end she described various tools for selecting a journal or conference in which to publish.
Day 2: 17 May 2016
The first session of day 2 was conducted by Jyoti and Rinky, Ph.D. students at JNU, Delhi, on the basics of MATLAB. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language. A proprietary programming language developed by MathWorks, MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, Fortran and Python. In this session they explained the basic commands used in classification and clustering, and all the participants practiced them in the lab. The session was very interactive and fruitful.
The next session was conducted by Dr. R. K. Aggarwal, Head of the Computer Science Department, JNU, New Delhi, on the main components of machine learning, i.e. classification and clustering. Classification is the problem of identifying to which of a set of categories (sub-populations) a new observation belongs, decided on the basis of a training set of data containing observations (or instances) whose category membership is known. Classification algorithms are already used in areas such as fingerprint recognition, face recognition, detection of oil slicks, loan assessment and share market prediction. After that he gave a brief explanation of the perceptron, with a practical implementation in MATLAB. In machine learning, the perceptron is an algorithm for supervised learning of binary classifiers: functions that can decide whether an input (represented by a vector of numbers) belongs to one class or another. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector. He then explained why we need the multilayer perceptron.
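The perceptron's mistake-driven update rule fits in a few lines. The session's demo used MATLAB; the sketch below is an illustrative pure-Python version with our own (hypothetical) function names and toy data, learning a separator for the logical AND function.

```python
# Minimal perceptron sketch: a linear classifier trained by the
# mistake-driven update rule. Illustrative only; data and names are toy.

def perceptron_train(samples, labels, epochs=20, lr=1.0):
    """Learn weights w and bias b so that sign(w.x + b) matches each label (+1/-1)."""
    w, b = [0.0] * len(samples[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if activation >= 0 else -1
            if pred != y:                      # update weights only on a mistake
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def perceptron_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Logical AND is linearly separable: only (1, 1) gets the +1 label.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, -1, -1, 1]
w, b = perceptron_train(X, y)
```

On linearly separable data such as this, the perceptron stops making updates after finitely many mistakes; the multilayer perceptron mentioned above is needed precisely for data (like XOR) that no single linear separator can handle.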
In the next session he explained the advantages of the Support Vector Machine (SVM) over neural networks. In machine learning, support vector machines are supervised learning models with associated learning algorithms that analyze data for classification and regression analysis. An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
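The "widest gap" idea can be made concrete with a toy example. The sketch below does not train an SVM; it simply shows, for a hand-picked separating hyperplane w.x + b = 0 (our own made-up numbers), how new points are classified by the side of the gap they fall on, and how the margin width 2/||w|| is computed.

```python
import math

# Geometric core of the SVM: classify by which side of the hyperplane
# w.x + b = 0 a point falls on; the margin width is 2/||w||.
# The hyperplane here is hand-picked for the toy data, not learned.

w, b = (1.0, 1.0), -3.0                  # separating line x + y = 3

def side(x):
    """+1 on one side of the gap, -1 on the other."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

def margin_width(w):
    return 2.0 / math.sqrt(sum(wi * wi for wi in w))

positives = [(3, 2), (4, 4)]             # fall on the +1 side
negatives = [(0, 0), (1, 1)]             # fall on the -1 side
```

A real SVM trainer chooses w and b to maximize this margin subject to every training point landing on its correct side.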
In the end he explained the clustering side of machine learning. Cluster analysis, or clustering, is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense) to each other than to those in other groups (clusters). Cluster analysis itself is not one specific algorithm, but the general task to be solved. It can be achieved by various algorithms that differ significantly in their notion of what constitutes a cluster and how to find clusters efficiently. Popular notions of clusters include groups with small distances among the cluster members, dense areas of the data space, intervals or particular statistical distributions.
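One concrete instance of the "small distances among cluster members" notion is the classic k-means algorithm, sketched below in pure Python. This is a general illustration of clustering, not the session's MATLAB material; seeding the centroids with the first k points is a deliberate simplification to keep the run deterministic.

```python
# Minimal k-means clustering sketch. Deterministic for illustration:
# the first k points seed the centroids.

def kmeans(points, k, iters=10):
    centroids = [list(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        # assignment step: each point joins its nearest centroid
        for i, p in enumerate(points):
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            assign[i] = dists.index(min(dists))
        # update step: move each centroid to the mean of its members
        for j in range(k):
            members = [p for i, p in enumerate(points) if assign[i] == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centroids

# two obvious groups: around (0, 0) and around (10, 10)
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels, cents = kmeans(pts, 2)
```

The alternation of assignment and update steps is what makes differing "notions of a cluster" possible: swap the distance or the update rule and you get a different clustering algorithm.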
Day 3: 18 May 2016
The session of day 3 was conducted by Dr. R. K. Aggarwal, Head of the Computer Science Department, JNU, New Delhi, on feature extraction and feature selection. First, he explained the k-nearest neighbors classification algorithm. In pattern recognition, the k-nearest neighbors algorithm (k-NN for short) is a non-parametric method used for classification and regression. In both cases, the input consists of the k closest training examples in the feature space; the output depends on whether k-NN is used for classification or regression. The neighbors are taken from a set of objects for which the class (for k-NN classification) or the object property value (for k-NN regression) is known. This can be thought of as the training set for the algorithm, though no explicit training step is required. The participants implemented it in MATLAB on the fisheriris dataset. He then explained feature extraction: in machine learning, pattern recognition and image processing, feature extraction starts from an initial set of measured data and builds derived values (features) intended to be informative and non-redundant, facilitating the subsequent learning and generalization steps. After that he discussed feature selection, whose methods fall into two categories: the wrapper method and the filter method. Feature selection algorithms are important to recognition and classification systems because, if a feature space with a large dimension is used, the performance of the classifier decreases with respect to execution time and recognition rate; the execution time increases with the number of features because of the measurement cost. He explained one of the most effective families of feature selection techniques, the sequential floating search methods (SFSM). There are two main categories of floating search methods: forward (SFFS) and backward (SFBS).
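The k-NN rule described above needs no training step at all: classification is simply a vote among the nearest stored examples. Below is a pure-Python sketch on a made-up 2-D dataset, standing in for the fisheriris data used in the session.

```python
from collections import Counter

# Sketch of the k-NN classifier: label a query point by majority vote
# among its k nearest training examples (squared Euclidean distance).
# The toy dataset and names are illustrative only.

def knn_classify(train, query, k=3):
    """train: list of ((features), label) pairs."""
    by_dist = sorted(train,
                     key=lambda item: sum((a - b) ** 2
                                          for a, b in zip(item[0], query)))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

train = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
         ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]
```

Note that the whole training set must be scanned for every query, which is exactly why large feature spaces hurt k-NN's execution time, as discussed above.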
Basically, in the case of forward search (SFFS), the algorithm starts with an empty feature set and, at each step, the best feature that satisfies some criterion function is added to the current feature set, i.e., one step of sequential forward selection (SFS) is performed. The algorithm also checks whether the criterion improves if some feature is excluded; in that case, the worst feature (with respect to the criterion) is eliminated from the set, i.e., one step of sequential backward selection (SBS) is performed. The SFFS thus proceeds by dynamically increasing and decreasing the number of features until the desired number is reached.
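The forward/backward interplay above can be sketched directly. In the toy below, the criterion function is a made-up lookup table (standing in for, say, classifier accuracy on a validation set); only the SFFS control flow is the point.

```python
# Simplified SFFS sketch: grow the feature set one best feature at a time
# (an SFS step), then check whether dropping an already-chosen feature
# improves the criterion (a floating SBS step). Toy criterion, toy features.

def sffs(features, criterion, target_size):
    selected = []
    while len(selected) < target_size:
        # SFS step: add the single feature that maximizes the criterion
        best = max((f for f in features if f not in selected),
                   key=lambda f: criterion(selected + [f]))
        selected.append(best)
        # floating SBS step: drop a feature if that improves the criterion
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in list(selected):
                reduced = [g for g in selected if g != f]
                if criterion(reduced) > criterion(selected):
                    selected = reduced
                    improved = True
                    break
    return selected

# Toy criterion: "a" and "c" are complementary; "b" is largely redundant.
scores = {frozenset(): 0.0, frozenset("a"): 0.6, frozenset("b"): 0.5,
          frozenset("c"): 0.4, frozenset("ab"): 0.65, frozenset("ac"): 0.9,
          frozenset("bc"): 0.55, frozenset("abc"): 0.85}
crit = lambda subset: scores[frozenset(subset)]
```

With only three candidate features the backward step never fires here; on larger feature sets it is precisely what lets SFFS escape the nesting problem of plain SFS.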
Then he explained how to correlate features, with an implementation in MATLAB. In the evening session he explained three more feature extraction techniques: temporal, spectral and spatial. He then discussed a popular research topic: Brain Computer Interfaces (BCI). A brain-computer interface (BCI), sometimes called a mind-machine interface (MMI), direct neural interface (DNI), or brain-machine interface (BMI), is a direct communication pathway between an enhanced or wired brain and an external device. BCIs are often directed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions. BCIs focusing on motor neuroprosthetics aim either to restore movement in individuals with paralysis or to provide devices that assist them, such as interfaces with computers or robot arms.
Day 4: 19 May 2016
The session of day 4 was taken by Dr. Anup Girdhar, CEO, Sedulity Solutions and Technologies, on cyber security and forensics. First he explained the meaning of cyber security: cybersecurity, or IT security, is the protection of information systems from theft of or damage to the hardware, the software, and the information residing on them, as well as from disruption or misdirection of the services they provide. To secure a computer system, it is important to understand the attacks that can be made against it; these threats can typically be classified into categories such as backdoors, direct-access attacks and tampering. He cautioned against careless use of social sites such as Facebook, WhatsApp and Twitter, through which hackers, or companies such as Google, can collect comprehensive data about users. Mobile banking likewise exposes our information: attackers can intercept packets in transit, modify them, forward them on, and then misuse the information they carry. After that he explained the netstat (network statistics) tool: a command-line network utility that displays network connections for the Transmission Control Protocol (both incoming and outgoing), routing tables, and a number of network interface (network interface controller or software-defined network interface) and network protocol statistics. It is available on Unix-like operating systems including OS X, Linux, Solaris and BSD, and on Windows NT-based operating systems including Windows XP, Windows Vista, Windows 7, Windows 8 and Windows 10. It is used for finding problems in the network and for measuring the amount of traffic on the network. Cyber forensics, also called computer forensics or digital forensics, is the process of extracting information and data from computers to serve as digital evidence for civil purposes or, in many cases, to prove and legally prosecute cybercrime.
The evening session was conducted by Dr. Ashutosh Dixit, Associate Professor (CE), YMCAUST, Faridabad, on information retrieval and web crawlers. He explained that information retrieval (IR) is the activity of obtaining information resources relevant to an information need from a collection of information resources. Searches can be based on metadata or on full-text (or other content-based) indexing. Automated information retrieval systems are used to reduce what has been called "information overload". Many universities and public libraries use IR systems to provide access to books, journals and other documents; web search engines are the most visible IR applications. Then he explained web crawlers. A web crawler is an Internet bot which systematically browses the World Wide Web, typically for the purpose of web indexing. Web search engines and some other sites use web crawling software to update their own web content or their indexes of other sites' web content. Web crawlers can copy all the pages they visit for later processing by a search engine, which indexes the downloaded pages so that users can search much more efficiently. As the number of pages on the Internet is extremely large, even the largest crawlers fall short of making a complete index.
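The crawl-and-index loop described above can be sketched in a few lines. To stay self-contained, the toy crawler below "fetches" from a hypothetical in-memory site instead of the real web; the page contents and URLs are invented for illustration.

```python
from collections import deque
from html.parser import HTMLParser

# Toy crawler: breadth-first traversal that stores each visited page in an
# index and follows the links it finds. SITE stands in for the live web.

SITE = {
    "/":  '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a> welcome page a',
    "/b": 'no links here',
}

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

def crawl(start):
    seen, queue, index = set(), deque([start]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        index[url] = SITE[url]       # a real engine would parse and rank this
        parser = LinkExtractor()
        parser.feed(SITE[url])
        queue.extend(parser.links)
    return index

pages = crawl("/")
```

The `seen` set is what keeps a crawler from revisiting pages; the frontier queue is where real crawlers apply politeness and prioritization policies.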
Day 5: 20 May 2016
The session of day 5 was conducted by Mr. Ajay Goel, VP, Aricent Technologies, on the changing technological paradigm. The procedures and the nature of "technologies" are suggested to be broadly similar to those which characterize "science"; in particular, they appear to be research programs performing a role similar to that of scientific paradigms. The model tries to account for both continuous changes and discontinuities in technological innovation: continuous changes are often related to progress along a technological trajectory defined by a technological paradigm, while discontinuities are associated with the emergence of a new paradigm. He then gave examples such as the autonomous car, the smart meter and big data analytics. An autonomous car is a vehicle that is capable of sensing its environment and navigating without human input. Autonomous vehicles detect their surroundings using radar, lidar, GPS, odometry and computer vision. Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. Autonomous cars have control systems that are capable of analyzing sensory data to distinguish between different cars on the road, which is very useful in planning a path to the desired destination.
A smart meter is an electronic device that records consumption of electric energy in intervals of an hour or less and communicates the information at least daily to the utility for monitoring and billing. Smart meters enable two-way communication between the meter and the central system, and unlike home energy monitors they can gather data for remote reporting. Such an advanced metering infrastructure (AMI) differs from traditional automatic meter reading (AMR) in that it enables two-way communication with the meter.
The next session was conducted by Mr. Himanshu Pandey, Senior Manager, Aricent Technologies, and Dr. Naresh Chauhan, Associate Professor (CE), YMCAUST, Faridabad, on agile methodologies. Agile software development is a set of principles for software development in which requirements and solutions evolve through collaboration between self-organizing, cross-functional teams. They discussed the first principle of agile: the highest priority is to satisfy the customer through early and continuous delivery of valuable software. After that they gave a brief overview of agile by explaining the key components of agile methodologies, Scrum, and the challenges faced with traditional methodologies. Agile development can be a very exciting and invigorating approach, although some projects suit agile more than others; the collaboration and visibility can provide a much richer and more rewarding experience for teams developing great software products.
Day 6: 23 May 2016
The session of day 6 was conducted by Mr. Shray Madan, Assistant Programmer, EMC, on cloud computing and virtual machines. He spoke about cloud computing, big data, data storage and security. Cloud computing, also known as on-demand computing, is a kind of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. He then discussed the RSA algorithm used for security, and told the participants about the EMC Academic Alliance. EMC collaborates with colleges and universities worldwide to help prepare students for successful careers in a transforming IT industry. The EMC Academic Alliance program offers unique 'open', curriculum-based education on technology topics such as cloud computing, big data analytics, information storage and management, and backup recovery systems and architecture. All courseware and faculty training are offered at no cost to qualifying higher education institutions. The courses focus on technology concepts and principles applicable to any vendor environment, enabling students to develop the highly marketable knowledge and skills required in today's evolving IT industry. He also discussed virtual machines. In computing, a virtual machine (VM) is an emulation of a particular computer system. Virtual machines operate based on the computer architecture and functions of a real or hypothetical computer, and their implementations may involve specialized hardware, software, or a combination of both.
The next session was conducted by Ms. Anju Mehra, Manager, and Mr. Abhishek, Asia Markets Reporter, Thomson Reuters, on citation analysis and impact factor. First they discussed the need for citations. Citation analysis is the examination of the frequency, patterns and graphs of citations in articles and books. It uses citations in scholarly works to establish links to other works or other researchers, and is one of the most widely used methods of bibliometrics.
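The journal impact factor discussed in this session has a simple arithmetic definition: citations received in a year to items the journal published in the two preceding years, divided by the number of citable items published in those years. The sketch below uses made-up figures for a hypothetical journal.

```python
# Impact factor for `year`: citations in `year` to items from the two prior
# publication years, divided by the citable items published in those years.
# The journal figures below are invented for illustration.

def impact_factor(cites_to_pub_year, items_published, year):
    cites = cites_to_pub_year[year - 1] + cites_to_pub_year[year - 2]
    items = items_published[year - 1] + items_published[year - 2]
    return cites / items

# citations received in 2016, broken down by the cited item's publication year
citations = {2014: 120, 2015: 80}
published = {2014: 40, 2015: 60}
if_2016 = impact_factor(citations, published, 2016)
```

Here the hypothetical journal earns (120 + 80) citations against (40 + 60) citable items, giving an impact factor of 2.0.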
The last session was conducted by Mr. Neeraj Tayal and Ms. Hardit Kaur, Examiners of Patents, Ministry of Commerce, Govt. of India, and Dr. Dharmender Saini, Principal, Bharati Vidyapeeth College of Engineering, New Delhi, on how to file a patent. They discussed the meaning of a patent, the steps to file one, and what is not patentable.
Day 7: 24 May 2016
The session of day 7 was conducted by Mr. Vipin Gupta, Sr. Consultant, U-Net Solutions, on software defined networking. Software Defined Networking (SDN) is an approach to computer networking that allows network administrators to manage network services through abstraction of higher-level functionality. This is done by decoupling the system that makes decisions about where traffic is sent (the control plane) from the underlying systems that forward traffic to the selected destination (the data plane). He then explained Mininet, a network emulator, or perhaps more precisely a network emulation orchestration system. It runs a collection of end hosts, switches, routers and links on a single Linux kernel, using lightweight virtualization to make a single system look like a complete network running the same kernel, system and user code. Then he explained Docker tools with an example. Docker is an open-source project that automates the deployment of applications inside software containers. It provides an additional layer of abstraction and automation of operating-system-level virtualization on Linux, using resource isolation features of the Linux kernel such as cgroups and kernel namespaces, and a union-capable file system.
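The control-plane/data-plane split can be illustrated with a toy model. This is our own simplification, not the OpenFlow protocol: switches forward purely by flow-table lookup, and a table miss is punted to a central controller that installs a rule.

```python
# Toy SDN model: the data plane (Switch) only does table lookups; the
# control plane (Controller) holds the global view and installs rules.
# Addresses and topology below are invented for illustration.

class Switch:
    def __init__(self):
        self.flow_table = {}            # match field (dst address) -> output port

    def install_rule(self, dst, port):  # called by the controller
        self.flow_table[dst] = port

    def forward(self, packet):
        """Return the output port, or None to punt to the controller."""
        return self.flow_table.get(packet["dst"])

class Controller:
    """Centralized control plane: decides where traffic should go."""
    def __init__(self, topology):
        self.topology = topology        # dst address -> port, global view

    def handle_miss(self, switch, packet):
        port = self.topology[packet["dst"]]
        switch.install_rule(packet["dst"], port)  # push rule to the data plane
        return port

sw = Switch()
ctrl = Controller({"10.0.0.1": 1, "10.0.0.2": 2})
pkt = {"dst": "10.0.0.2"}
port = sw.forward(pkt)
if port is None:                        # first packet: table miss
    port = ctrl.handle_miss(sw, pkt)
```

After the first miss the rule is cached in the switch, so subsequent packets to the same destination are forwarded without consulting the controller, which is the essence of the decoupling described above.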
The next session was conducted by Mr. Himanshu Srivastava, Network Administrator at Intaglio Solutions, on the Packet Tracer tool. First he gave a brief introduction to networking concepts such as IP addresses and types of network. After that the participants created a network using the Packet Tracer tool.
Day 8: 25 May 2016
The session of day 8 was conducted by Prof. Prerna Gaur, Associate Professor, NSIT, on advanced AI control techniques. She gave a brief introduction to fuzzy logic, membership functions, fuzzy sets and the fuzzy logic controller. Fuzzy logic is a form of many-valued logic in which the truth values of variables may be any real number between 0 and 1. By contrast, in Boolean logic the truth values of variables may only be 0 or 1, often called crisp values. Fuzzy logic is employed to handle the concept of partial truth, where the truth value may range between completely true and completely false. After that, the participants implemented a fuzzy logic problem in MATLAB.
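A membership function is what turns a crisp value into a degree of truth between 0 and 1. Below is the standard triangular membership function in pure Python; the "comfortable temperature" set and its breakpoints are our own illustrative choice, not from the session.

```python
# Triangular membership function: 0 outside [a, c], rising linearly to 1
# at the peak b and falling back to 0. A building block of fuzzy sets.

def triangular(x, a, b, c):
    """Degree of membership of x in the fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# e.g. a fuzzy set "comfortable temperature" peaking at 22 degrees C
comfortable = lambda t: triangular(t, 15, 22, 29)
```

A fuzzy logic controller evaluates several such sets at once (e.g. "cold", "comfortable", "hot") and blends the rules that fire, rather than picking a single crisp branch.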
The evening session was conducted by Prof. A. K. Sharma, Dean, BSAITM, on object oriented technology and Prolog predicates. First he explained the basic concepts of OOPS (Object Oriented Programming System): polymorphism, inheritance and encapsulation. An object is any real-world entity, and it mainly has three properties: state, behavior and identity. After that he gave a brief introduction to Prolog predicates. Prolog is a logic language, not an algorithmic one, so one has to learn to think about programs in a somewhat different way. He then explained some programs in Prolog: union of lists, searching for an element in a list, checking whether the input is a palindrome, deleting the last element from a list, etc.
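For readers without a Prolog environment, the list exercises above can be mirrored in Python. Prolog expresses these as logical relations over lists; the procedural equivalents below are our own stand-ins, not the session's actual Prolog code.

```python
# Python stand-ins for the Prolog list exercises from the session.

def union(xs, ys):
    """Union of two lists, keeping the order of first appearance."""
    return xs + [y for y in ys if y not in xs]

def member(x, xs):
    """Prolog's member/2: does x occur in the list?"""
    return x in xs

def is_palindrome(xs):
    """A list is a palindrome if it equals its own reverse."""
    return xs == xs[::-1]

def delete_last(xs):
    """Drop the final element of the list."""
    return xs[:-1]
```

Where Python computes each answer directly, Prolog would state the relation (e.g. member(X, [X|_]). member(X, [_|T]) :- member(X, T).) and let unification and backtracking find it, which is the different way of thinking the speaker referred to.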
Day 9: 26 May 2016
The session of day 9 was taken by Prof. Manish Kumar, Associate Professor, BVICAM, on ad hoc networks. He gave a brief introduction to ad hoc networks: a wireless ad hoc network (WANET) is a decentralized type of wireless network. The network is ad hoc because it does not rely on a pre-existing infrastructure, such as routers in wired networks or access points in managed wireless networks. After that he discussed Network Simulator 2 (NS2), BonnMotion, NSG and the TCP/IP protocol, and the participants had a practical session on NS2.
The next session was conducted by Mr. Prashant, Aricent Technologies, on SDN, NFV and OpenStack software. He said that Software Defined Networking (SDN) is an approach to computer networking that allows network administrators to manage network services through abstraction of higher-level functionality. Network Functions Virtualization (NFV) is a network architecture concept that uses IT virtualization technologies to virtualize entire classes of network node functions into building blocks that may connect, or chain together, to create communication services. Then he discussed how OpenStack is related to SDN.
The evening session was conducted by Dr. Shanker Goenka, CEO, WoW Factor, Gurgaon, who spoke on motivating yourself and your thoughts through "whole brain thinking". He gave the participants some brain-related tasks, and it was a very interesting session. He also explained how to read a person like a book.
Day 10: 27 May 2016
The session of day 10 was conducted by Mr. Rakesh Maheshwari, Senior Director, Cyber Law and e-Security Group, Dept. of Electronics and IT, on cybercrime, information technology and emerging trends in the IT Act 2000. He discussed the need for cyber law: in today's world everything is on the Internet, and the law is needed to exercise control when required. He then described some problems faced today for which there is as yet no law. After that he explained the objective of the IT Act 2000, which was enacted to enable e-commerce in the country.
The next session was conducted by Mr. B. Venkat S. R. Swamy and Mr. Anil Pathak from Aricent Technologies on IoT and smart cities. Mr. Venkat first spoke about the Internet of Things (IoT): the network of physical objects - devices, vehicles, buildings and other items - embedded with electronics, software, sensors and network connectivity that enables these objects to collect and exchange data. After that Mr. Anil gave a brief introduction to smart cities, citing the example of the My BUS project in Bhopal. Bhopal City Link Limited (BCLL) has been selected as one of the project cities for implementing the Global Environmental Facility (GEF) Efficient and Sustainable City Bus Transport Service Project (ESCBSP), which aims to encourage city bus operations in India's urban transport systems.
The last session was conducted by Mr. Gurjit, Aricent Technologies, on web mining. Web mining is the use of data mining techniques to automatically discover and extract information from web documents and services.
There are three general classes of information that can be discovered by web mining: web content, web structure, and web usage (activity recorded in server logs and through browser activity tracking).
Overall, the Faculty Development Program was very beneficial for all the participants, since new research topics were covered by all the speakers. Participants also gained hands-on experience with many useful tools such as MATLAB and Packet Tracer.