Natural User Interface Group (NUI Group)
Mailing List: email@example.com
Founded in 2006, the Natural User Interface Group (NUI Group) is an interactive media group researching and creating open source sensing and display techniques for artistic and educational applications. NUI Group is also a worldwide community that offers a collaborative environment for developers interested in learning and sharing new HCI (Human-Computer Interaction) methods and concepts. Topics include voice, handwriting, and gesture recognition, touch computing, computer vision, and information visualization.

Our current focus is "Open Source Interface": accelerating the development of existing hardware and sensing solutions so that we can find the cheapest and most effective ways to construct our input devices. The project attracts a wide variety of people from around the globe: we are students, researchers, interaction designers, user interface designers, and software engineers working on open source hardware and software solutions. With over 8000 members worldwide, NUI Group is changing the way humans and computers interact.

One very important aspect of this project is creating and using open standards that allow software development to flourish. For example, we use the TUIO protocol, the standard for tabletop communication. Another crucial standard that must be created in an open environment is a set of gesture standards, which would allow fluid interaction across input devices.

Our doors are always open and we are looking for new people with similar interests and dreams. We believe that community is more powerful than money or technology. Our main project, CCV, has opened up computer vision to a new audience by lowering the bar for new researchers in the field, and we have GSoC to thank for this.
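TUIO messages, mentioned above as our tabletop communication standard, are carried as OSC (Open Sound Control) packets. As a rough illustration of what travels over the wire, the Python sketch below hand-encodes a TUIO `/tuio/2Dcur` "set" message (one touch cursor's position and motion); a real client would normally use an existing TUIO/OSC library rather than packing bytes by hand.

```python
import struct

def osc_string(s):
    """Encode an OSC string: UTF-8, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def tuio_2dcur_set(session_id, x, y, vx=0.0, vy=0.0, accel=0.0):
    """Build a TUIO /tuio/2Dcur 'set' message as raw OSC bytes.

    Arguments follow the TUIO cursor profile: session id, normalized
    position (x, y), velocity (vx, vy), and motion acceleration.
    """
    msg = osc_string("/tuio/2Dcur")        # OSC address pattern
    msg += osc_string(",sifffff")          # type tags: string, int32, 5 floats
    msg += osc_string("set")
    msg += struct.pack(">i", session_id)   # OSC numbers are big-endian
    msg += struct.pack(">fffff", x, y, vx, vy, accel)
    return msg

packet = tuio_2dcur_set(1, 0.5, 0.5)
# The packet would typically be sent as a UDP datagram to port 3333,
# the conventional TUIO port.
```

Trackers such as CCV bundle these "set" messages together with "alive" and "fseq" messages each frame; the sketch shows only the per-cursor payload.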
We maintain a set of open source projects, including:
- Community Core Vision (CCV) – cross-platform computer vision framework (GSoC 2008/2009 project)
- Touchlib – the first open source library for multi-touch screen operation, working under Linux and Windows
- Touch'e – multi-touch framework for Mac OS X
- BBTouch – multi-touch framework for Mac OS X
- PyMT – multi-touch library and set of applications written in Python (GSoC 2009 project)
- OpenTouch – multi-touch library for Mac OS X (WinLibre GSoC 2007 Best Success; GSoC 2008 project)
- TouchAPI – a library for rapid prototyping of multi-touch client applications in Adobe Flash/Flex/AIR or Silverlight/WPF
- QMTSim – TUIO simulator written in C++ using the Qt library (GSoC 2008 project)
- TUIO Simulator – Java TUIO protocol simulator that lets you test multi-touch applications without a multi-touch screen (written by the reacTIVision project team)

Beyond these projects, we also look forward to working with other open source projects such as reacTIVision, libavg, opentable, and many more that are widely created and used by NUI Group members. We have also started to work on multi-touch applications for Android and iPad/iPhone/iPod Touch using the recently released iPhone SDK, and we welcome innovative project proposals from the community.
- Community Core Audio - A Companion Project to CCV Community Core Audio (CCA) is an openFrameworks-based solution for voice user interfaces. CCA manages voice input, converts voice to text, and sends messages out over the network. CCA will use a UI and workflow similar to CCV's; the two stay separate but can work together, so people can use both and experiment with computer vision and audio side by side. A standalone openFrameworks addon, ofxASR, will be developed as well; ofxASR will serve as the Automatic Speech Recognition (ASR) engine for CCA.
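To make the "sends messages out over the network" step concrete, here is a minimal Python sketch of how a recognition result might be framed and broadcast as a UDP datagram. The JSON message format, field names, and port number are illustrative assumptions, not CCA's actual wire format.

```python
import json
import socket

def make_speech_message(text, confidence):
    """Frame a recognition result as a JSON payload.

    The schema here ({"type", "text", "confidence"}) is hypothetical,
    chosen only to illustrate the idea of a text-over-network message.
    """
    return json.dumps({"type": "speech",
                       "text": text,
                       "confidence": confidence}).encode("utf-8")

def send_speech_message(payload, host="127.0.0.1", port=3000):
    """Fire-and-forget UDP send, mirroring how trackers like CCV
    broadcast TUIO events to listening applications."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, (host, port))
    finally:
        sock.close()

payload = make_speech_message("open file", 0.92)
```

A client application would listen on the same port and dispatch on the `"type"` field, just as TUIO clients dispatch on the OSC address.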
- Extensions to the MT4J Framework providing generic, stylable components for user interface building Multi-touch 4 Java (MT4J) is a framework that enables users to create multi-touch capable user interfaces. By providing basic components, it lets users build sophisticated UIs that are completely configurable. However, the components it currently provides are not readily usable by visual artists and user interface prototypers. My goal is to create higher-level, stylable, visible components that can be used intuitively as building blocks in multi-touch applications.
- NUIML + CCS - State and render management for MT user interfaces This project aims to enhance an existing MT component framework, e.g. Lux. By separating the user interface from back-end application code, complexity can be reduced and development opened up to multiple languages and platforms. Proposed are a simple markup language designed to streamline user interface specification, and a server-client model that allows remote state management from back-end applications. Basic details are provided to aid understanding.
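To give a feel for what separating UI specification from application code could look like, the Python sketch below parses a small hypothetical markup snippet into a widget tree that a renderer or state server could consume. The element and attribute names are invented for illustration; they are not taken from the actual NUIML proposal.

```python
import xml.etree.ElementTree as ET

# Hypothetical UI markup -- the tag and attribute names below are
# illustrative only, not the real NUIML vocabulary.
MARKUP = """
<ui>
  <panel id="toolbar">
    <button id="save" label="Save"/>
    <slider id="zoom" min="0" max="100" value="50"/>
  </panel>
</ui>
"""

def build_widget_tree(markup):
    """Parse markup into a nested dict: the back-end would manipulate
    this state remotely while the front-end handles rendering."""
    def node(el):
        return {"tag": el.tag,
                "attrs": dict(el.attrib),
                "children": [node(child) for child in el]}
    return node(ET.fromstring(markup))

tree = build_widget_tree(MARKUP)
```

In a server-client model as proposed, state changes (e.g. updating the slider's `value`) would be sent over the network and applied to this tree rather than to platform-specific widget objects.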
- PyMT: New and Advanced Interaction Widgets: Text Input While PyMT already contains several widgets and facilities for UI creation, text input, 3D drawing, and user interaction, few of these fully exploit the potential of multi-touch hardware; most of the existing widgets are reimplementations of what we already know from the WIMP world. The goal of this proposal is to design and implement advanced text input methods.
- User-Defined Blob Detection and Tracking in CCV Tracking of patterns (blobs) specified by the user through a template selection tool. This will make it possible to detect and track objects of different shapes, or particular shaped devices such as mobile phones, on the surface.
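As a rough illustration of template-based matching, the Python sketch below slides a user-supplied template over an image and returns the position with the smallest sum-of-absolute-differences score. CCV itself is written in C++ and would use a more robust, real-time matcher, so this is only a conceptual sketch of the idea.

```python
def match_template(image, template):
    """Find the best placement of `template` inside `image`.

    Both arguments are 2D lists of pixel intensities. Returns the
    (row, col) of the top-left corner minimizing the sum of absolute
    differences (SAD) between template and image patch.
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best_score, best_pos = float("inf"), None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if sad < best_score:
                best_score, best_pos = sad, (r, c)
    return best_pos

# A 2x2 bright square as the user-selected template; the "image" contains
# a matching bright region starting at row 1, column 2.
image = [[0, 0, 0, 0, 0],
         [0, 0, 9, 9, 0],
         [0, 0, 9, 9, 0],
         [0, 0, 0, 0, 0]]
template = [[9, 9],
            [9, 9]]
```

Tracking would then associate the matched position across frames, in the same way CCV already tracks plain blobs between camera frames.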