Quizzes are one of the best ways to verify one's own knowledge and learn about unfamiliar topics. Hence, at CEV, we organized a quiz competition on 16th March 2016. The quiz covered various topics such as general knowledge, movies, books, the latest technology, and general aptitude.
We would like to thank all the participants for taking part in the quiz.
In our college, every student chapter organizes an orientation program in which it introduces freshers to the chapter and the activities it conducts throughout the year. Accordingly, the Orientation program of Cutting Edge Visionaries (CEV) was organized on 3rd September 2015.
In this Orientation program, students were given information about the college, life at college, the facilities the college provides, how to use those facilities properly, and how to stay productive in their spare time. They received guidance on everything from which projects they could take up to which TV shows and movies are worth watching.
The main aim of the program was to help students become productive and get involved in creative activities, whether technical or non-technical. The core idea of CEV is that people learn something new, something different, and share it with others in the future.
Those who were unable to attend may contact friends who attended and collect the handouts.
Google projects tend to have a forward focus, and the company pours resources into research across many fields. In the past, it has come forward with projects like Google Maps, Google Glass, Google Goggles, and the driverless car, which have changed our lives and made them easier.
Some of the new projects Google is currently putting resources into developing are:
The smartphone is one of the most empowering and intimate objects in our lives. Yet most of us have little say in how the device is made, what it does, and how it looks. And 5 billion of us don't have one. What if you could make a thoughtful choice about exactly what your phone does, and use it as a creative canvas to tell your own story?
With this idea, Google came forward with the Project ARA. Project ARA is the codename of an initiative that aims to develop an open hardware platform for creating highly modular smartphones. The platform will include a structural frame or endoskeleton that holds the smartphone modules of the owner’s choice such as a display, camera, speaker, processor or an extra battery.
The project was originally headed by the Advanced Technologies and Projects team within Motorola Mobility while it was a subsidiary of Google, working in collaboration with Phonebloks. Although Google later sold Motorola to Lenovo, it retained the project team.
ARA smartphones are built from modules inserted into an endoskeleton frame, which will be the only component of an ARA smartphone made by Google itself. The modules, attached to the frame by electro-permanent magnets, can provide common features such as a camera, speakers, and display, as well as specialized ones like medical devices, printers, projectors, night-vision sensors, or game-controller buttons.
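As a rough illustration of the endoskeleton-plus-modules idea, the architecture can be sketched as a frame with swappable slots. The class and method names below are our own assumptions for illustration, not anything from Google's specification:

```python
# Toy model of ARA's architecture: an endoskeleton frame holding
# user-chosen modules that can be swapped at will. All names are
# assumed for illustration only.

class Frame:
    """The endoskeleton: the only Google-made part of an ARA phone."""

    def __init__(self, slots):
        self.slots = {i: None for i in range(slots)}

    def attach(self, slot, module):
        # In hardware, electro-permanent magnets hold the module in place.
        self.slots[slot] = module

    def swap(self, slot, new_module):
        """Replace whatever is in `slot`, returning the old module."""
        old = self.slots[slot]
        self.slots[slot] = new_module
        return old

phone = Frame(slots=3)
phone.attach(0, "display")
phone.attach(1, "camera")
print(phone.swap(1, "extra battery"))  # prints "camera"
```

Swapping a module never disturbs the others, which is the whole appeal of the design: upgrade the camera or add an extra battery without buying a new phone.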
Google wants Project ARA to lower the barrier to entry for phone hardware manufacturers, so there could be “hundreds of thousands of developers” instead of the current handful of big manufacturers. This would be similar to how the Google Play Store is structured: lowering the barrier to entry allows many more people to develop modules, and anyone would be able to build one without requiring a license or paying a fee.
One of the big problems with wearable devices right now is input: there is no simple way to control them. At a time when most gesture-sensing technology is unreliable and clunky, Project SOLI, one of Google's latest cutting-edge experiments from its ATAP group, provides an enticing example of the kind of powerful motion controller that could actually change how we interact with everything from smartwatches and tablets to appliances and other everyday objects.
Unlike the gesture-control tech that came before, Google's interactive sensor uses radar to translate subtle hand movements into gesture controls for electronic devices. The sensor can track sub-millimeter motions at the speedy rate of 10,000 frames per second and with exceptional accuracy. Not only that, it fits on a fingertip-sized chip and can be used in everyday devices.
Using radar is a fundamentally different approach to gesture tracking because, unlike camera-based systems that use a lens, the radar in Project SOLI travels through certain materials, making it possible to place the chip inside devices and out of sight.
The basic working principle is that the system beams out a continuous signal, which is reflected by the hand and arm, and measures the difference between the emitted and received signals. The reflected waveform is very complex, so the system applies signal processing and machine-learning techniques to detect gestures from it.
The gestures the team chose during testing were selected for their similarity to standard actions we perform every day. For example, rubbing thumb and finger together could control volume, swiping the thumb across the side of a closed index finger could scroll across a flat plane, and tapping finger and thumb together would press a button.
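To make the pipeline above a little more concrete, here is a minimal sketch of the final classification step only: labeling a gesture from the per-frame radial velocities that a SOLI-style radar would recover from the Doppler shift between the emitted and received signals. The function, thresholds, and gesture rules are our own toy assumptions, not Google's actual algorithm:

```python
# Hypothetical sketch (not Google's real pipeline): classify a gesture
# from per-frame radial hand velocities, in m/s, where positive means
# the hand is moving toward the sensor.

def classify(velocities, still=0.05):
    """Label a velocity trace as a coarse gesture."""
    # Ignore frames below the stillness threshold (sensor noise).
    moving = [v for v in velocities if abs(v) > still]
    if not moving:
        return "idle"
    toward = sum(1 for v in moving if v > 0)
    away = len(moving) - toward
    # A tap approaches and then retreats; a scroll moves one way throughout.
    if toward and away:
        return "tap"
    return "scroll"

print(classify([0.0, 0.3, 0.4, -0.3, -0.2, 0.0]))  # tap
print(classify([0.2, 0.25, 0.3, 0.28]))            # scroll
print(classify([0.01, -0.02, 0.0]))                # idle
```

A real system would feed far richer features of the reflected waveform into a trained model, but the sketch shows the shape of the problem: raw radar frames in, discrete gesture labels out.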
Google's ATAP department is already testing hardware applications for the technology, and we can hope to use it in the near future.
Till now we have heard a lot about wearable gadgets, and may even have used them, but what would it be like to make wearables we can actually wear? Yes! Those days are not far off. Google's Advanced Technology and Projects group is trying its hand at manufacturing high-tech fabrics and wearable electronics you can literally wear with Project JACQUARD, and for this it has signed a partnership with Levi Strauss & Co.
Project JACQUARD makes it possible to weave touch and gesture interactivity into any textile using standard industrial looms. Everyday objects such as clothes and furniture can be transformed into interactive surfaces.
Jacquard yarn structures combine thin metallic alloys with natural and synthetic yarns like cotton, polyester, or silk, making the yarn strong enough to be woven on any industrial loom.
Using conductive yarns, touch- and gesture-sensitive areas can be woven at precise locations anywhere on the fabric. Alternatively, sensor grids can be woven throughout the textile, creating a large interactive surface.
The complementary components are engineered to be as discreet as possible. Google developed innovative techniques to attach the conductive yarns to connectors and tiny circuits no larger than a button. These miniaturized electronics capture touch interactions, and various gestures can be inferred from them using machine-learning algorithms.
The captured touch and gesture data can be wirelessly transmitted to mobile phones or other devices to control a wide range of functions, connecting users to online services, apps, or phone features.
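As a simple illustration of that inference step (our own toy heuristic, not Jacquard's real machine-learning pipeline), a swipe direction can be read off from the time-ordered sequence of conductive-yarn columns on the woven grid that register a touch:

```python
# Toy sketch (assumed, not Jacquard's actual API): infer a swipe
# direction from the sequence of sensor-grid columns touched, listed
# in time order as the finger moves across the fabric.

def swipe_direction(touched_columns):
    """Given column indices in time order, return the swipe direction."""
    if len(touched_columns) < 2:
        return "none"
    delta = touched_columns[-1] - touched_columns[0]
    if delta > 0:
        return "right"
    if delta < 0:
        return "left"
    return "none"

print(swipe_direction([0, 1, 2, 3]))  # right
print(swipe_direction([3, 2, 1]))     # left
```

The device receiving this label over the wireless link could then map it to any action: skipping a song, silencing a call, and so on.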
Jacquard components are cost-efficient to produce, and the yarns and fabrics can be manufactured with standard equipment used in mills around the world.
Connected clothes offer new possibilities for interacting with services, devices, and environments, and these interactions can be reconfigured at any time. Further, Jacquard is a blank canvas for the fashion industry: designers can use it as they would any fabric, adding new layers of functionality to their designs without learning about electronics.
Google is not the first to create conductive threads; startups like OmSignal and Sensoria already sell shirts and running socks containing such threads and use their electronic innards to track various metrics associated with physical activity. But Google is trying to combine the radar-based Project SOLI with Project JACQUARD, which could be a revolutionary combination.