Sunday, February 23, 2014

Next-Generation Ultra-Slim USB 3.0 Hard Drives

Western Digital secures another market segment with the launch of its new line of portable hard drives in the "Passport" family, whose essential features now include USB 3.0 technology, allowing data transfer rates of up to 5 Gbps when connected to a system equipped with these ports.

The new models, My Passport Ultra and My Passport Slim, ship with a powerful software suite for encryption and dedicated backup management, as well as a tool for monitoring the drive's operational status at the hardware level. They are quite compact and come in a metal chassis that dissipates the little heat the internal unit may generate when running for long periods.

As with most devices of this kind, it is not advisable to drop them or subject them to extreme pressure or temperature. They come with a 3-year limited warranty and include a pouch to protect them cosmetically.

Here you can find more information about these new ultra-portable drives:

My Passport model comparison table

Until next time, friends!

Video Game Development: $93 Billion in 2013

The year 2013 marked an important moment for the video game industry: major firms such as Microsoft, Sony and Nintendo launched their latest-generation consoles, extending digital entertainment services beyond the classic experience of traditional console gaming.

According to a recent Gartner study, the global video game market reached a record high in 2013, growing 18% to a total of 93 billion dollars in sales for the year. This figure is expected to reach 111 billion by 2015. The revenue and growth of this segment of the entertainment industry have surpassed those of the film, music and video industries. Even more impressive, mobile games are the fastest-growing segment: in the United States they went from generating 11 billion to 21 billion dollars in just under two years.

Another interesting aspect of this technological phenomenon is that its influence is not only economic; it also drives, in an integral way, the development of the technologies that currently converge to deliver the experience users demand. By this I mean four fundamental elements: computing power, content, devices and bandwidth. Entertainment software is responsible for generating a significant share of the sales and innovation in each of these related industries.

Another segment that does not escape the effects of this industry's growth is the labor market. A few weeks ago a study of the video game labor market was released (the Gamasutra Salary Survey). Its goal was to assess the current income range of the different workers who make up the video game industry, analyzing the impact of the economic crisis on the sector from the standpoint of salaried workers' income. The conclusion is that average salaries grew about 7% during 2013, rising to USD 79,000 per year (in 2012 the average income was only USD 74,000).

Who earns the highest salaries? Programmers? Designers? Perhaps the sales team? Well, yes, the commercial side clearly takes the biggest slice. However, from the software development standpoint, which is the part that interests us, the results were as follows:
  • Programming: programmers are among the best-paid talent in the video game industry; their average salary rose to 92,962 dollars, up from 85,733 dollars in previous years.
  • Art and Animation: average salaries for artists and animators rose to 75,780, up from 71,354 the previous year.
  • Design: game designers, writers and creative directors averaged a package of 73,386, up from 70,223 the previous year.
  • Testing and Quality Assurance: quality control professionals (QA testers) are the lowest-paid workers in the games industry; their average salary fell to 47,910 dollars, down from 49,009 dollars the previous year.
  • Business: business and legal staff remain the best paid in the industry, with salaries averaging 106,452 in 2010.
The Dominican Republic Case:

In the specific case of the Dominican Republic, the first steps are only now being taken to encourage job creation and the establishment of companies in this field. VAP Dominicana is the first locally founded company aiming to do video game “outsourcing” from the country. It will operate as a kind of free-trade zone, producing video game components on commission for export to studios in the United States and Europe, where they will be “assembled”.

With seed capital of US$100 thousand for the first year, the project is one of the 12 run by Emprende, the technology business incubator located inside the Parque Cibernético. VAP was created in response to a proposal from what will be its first client, Trilogy Studio, whose president, Michael Pole, visited the country in 2006. After two fruitful meetings with nearly a hundred young people who presented quality video games built with “rustic” tools, Pole offered to hire whichever company was willing to build video games for the studio. Trilogy's founders are former Electronic Arts employees who started their own company three years ago, leaving behind the place where they worked on video games as famous as Halo 3 and Medal of Honor. The studio's bet is to develop virtual worlds: low-cost video games that charge a subscription, in the style of World of Warcraft.

The startup aims to generate US$20 million within five years and, over that period, to train between 150 and 200 people, whose salaries would grow from an average of US$15 thousand to US$40 thousand per year.

“Outsourcing” is to the video game industry what cement and rebar are to a building. It is estimated to account for 30% of the industry and generates jobs in specialized areas such as programming, writing, design, character creation and music.

The leading countries in video game “outsourcing” are China, Ireland, Eastern Europe and India. “For companies in the United States, a 13-hour time difference with China, along with certain cultural and language differences, makes the constant communication this kind of work requires complicated.” Since 2006 the Dominican Republic has aspired to capture part of that market, given the country's strategic location, and to foster similar companies that can drive this activity locally, together with animation for advertising and film.

But the company's main initial constraint still has to be overcome: finding qualified staff to reach the required production levels. National educational institutions must develop strategies to create an effective mechanism for recruiting talent and training people in the areas the industry requires, especially in complex software development; we will expand on this in an upcoming post. Given current demand, young people interested in entering the game development industry should master a variety of technologies, including C++, Java, OpenGL, DirectX, Blender, Maya and Photoshop, among others.

Thursday, February 13, 2014

Collaborative Filtering Algorithms for Automatic Recommender Systems

Recommender Systems follow two paradigms for selecting items: content-based and collaborative filtering. In content-based systems the user receives information similar to what they have shown interest in before, while in collaborative filtering the suggestions are items that people with similar interests have liked.

The existing literature describes Collaborative Filtering (CF) Recommender Systems as systems that work by collecting human judgments, expressed as ratings, over a set of items in a given domain, and that try to match people who share the same needs or tastes [Herlocker et al. 1999; Pazzani 1999; Adomavicius and Tuzhilin 2005; Breese et al. 1998]. The users of a collaborative system share their ratings and opinions on the items they know so that other users can decide what to choose. In exchange for sharing this information, the system provides personalized recommendations for items that may interest the user.

The basic process is to match the profile of the current user against the stored profiles of other users whose ratings are known; this is known as nearest-neighbor collaborative filtering.



CF algorithms can be grouped into two general classes [Adomavicius and Tuzhilin 2005; Breese et al. 1998]: memory-based algorithms, which rely on an entire neighborhood of users and their ratings to compute predictions [Herlocker et al. 1999; Adomavicius and Tuzhilin 2005], and model-based algorithms, which use those ratings to learn a model that is then used to predict [Ungar and Foster 1998; Kim and Yum 2005; Breese et al. 1998]. The information handled in CF consists of a set of items, users, and the ratings given by the users on those items: the problem space is defined as a users-by-items matrix, in which each cell holds the rating a specific user gave to a specific item.



Solving a typical CF problem means predicting what ratings a user would give to the items they have not yet rated, based on the ratings previously provided by the community of users [Adomavicius and Tuzhilin 2005; Herlocker et al. 1999].
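To make the memory-based approach concrete, here is a minimal sketch; the rating matrix, the choice of cosine similarity, and the predict helper are illustrative assumptions, not taken from the cited papers. It fills a missing cell of a small users-by-items matrix with a similarity-weighted average of the ratings of the most similar users:

```python
# Minimal user-based collaborative filtering sketch.
# The matrix values are hypothetical; 0 marks an item the user has not rated.
import numpy as np

ratings = np.array([
    [5, 3, 0, 1],   # user 0
    [4, 0, 0, 1],   # user 1
    [1, 1, 0, 5],   # user 2
    [0, 1, 5, 4],   # user 3
], dtype=float)

def cosine_sim(a, b):
    """Cosine of the angle between two rating vectors (0 if either is all zeros)."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / norm) if norm else 0.0

def predict(user, item, k=2):
    """Predict user's rating for item as the similarity-weighted average of the
    ratings given by the k most similar users who did rate that item."""
    neighbors = sorted(
        ((cosine_sim(ratings[user], ratings[u]), u)
         for u in range(len(ratings))
         if u != user and ratings[u, item] > 0),
        reverse=True,
    )[:k]
    num = sum(sim * ratings[u, item] for sim, u in neighbors)
    den = sum(abs(sim) for sim, _ in neighbors)
    return num / den if den else 0.0

print(round(predict(user=0, item=2), 2))  # estimated rating of user 0 for item 2
```

A model-based algorithm would instead fit a compact model (for example, clusters or latent factors) to the same matrix and predict from that model rather than from the raw neighborhood.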

Filtering Systems
There are several ways to filter information, depending on the learning algorithm employed. According to [Vélez and Santos, 2006], there are two ways of filtering information:

  • Collaborative Filtering: based on the ratings users give within a domain.
  • Content Filtering: based on the traditional keyword-based approach to information retrieval.


Collaborative Filtering can be implemented using several algorithmic approaches:

Horting Algorithm: a graph-based technique in which the nodes are the objects and the edges between nodes indicate the degree of similarity between two objects. Predictions are produced by traversing the graph between nearby nodes and combining the information from nearby objects.

Bayesian Belief Networks: Bayesian Belief Networks (BBNs) are also known as Belief Networks, Causal Probabilistic Networks, or Graphical Probabilistic Networks. A BBN is a graphical network that represents probabilistic relationships among variables. BBNs make it possible to reason under uncertainty and combine the advantages of an intuitive visual representation with a mathematical foundation in Bayesian probability: P(A|B) = P(A,B)/P(B).
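For reference, the conditional-probability definition quoted above, combined with the symmetric factorization of the joint probability, gives Bayes' theorem, which is what a belief network applies when it updates a prediction from observed evidence (A and B are generic events here, not symbols from the cited sources):

```latex
P(A \mid B) \;=\; \frac{P(A,B)}{P(B)} \;=\; \frac{P(B \mid A)\,P(A)}{P(B)}
```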

Cosine-Based Similarity: this similarity gives a good measure of how “alike” two vectors are in a multidimensional space; the space can describe features of users or of items, such as keywords. The similarity between two items is measured by computing the cosine of the angle between them, using the equation shown below:
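The equation referred to above is the standard cosine similarity between two item (or user) vectors; the symbols A, B and theta are the conventional ones, not notation taken from the post's sources:

```latex
\operatorname{sim}(\vec{A},\vec{B}) \;=\; \cos(\theta) \;=\; \frac{\vec{A}\cdot\vec{B}}{\lVert\vec{A}\rVert\,\lVert\vec{B}\rVert}
```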

Neural Networks: Neural Networks (NNs) provide a very convenient form of knowledge representation, in which nodes represent objects of the information retrieval process, such as keywords, and links represent their weighted association (relevance). NNs applied to collaborative filtering are a recent development; [Nasraoi, 2004] develops an application that predicts URLs to recommend to users according to their profile.

Pearson Correlation: a typical similarity metric between user preference functions or vector distances. The compared vectors score on a scale that goes from 1 (complete agreement) through 0 (no similarity) down to -1 (complete disagreement).
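As a reference, the Pearson correlation between two users u and v is normally computed over the set of items both have rated, with r_{u,i} a rating and \bar{r}_u a user's mean rating; this is the conventional formulation, not a formula quoted from the cited works:

```latex
w_{u,v} \;=\; \frac{\sum_{i \in I_{uv}} (r_{u,i}-\bar{r}_u)\,(r_{v,i}-\bar{r}_v)}
{\sqrt{\sum_{i \in I_{uv}} (r_{u,i}-\bar{r}_u)^2}\;\sqrt{\sum_{i \in I_{uv}} (r_{v,i}-\bar{r}_v)^2}}
```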


From a scientific and technical standpoint, this proposal intends to address some of the challenges identified as possible improvements to filtering mechanisms. Initially, our system model will be based on Granovetter's theory of weak ties, which states that the degree of overlap between two individuals varies directly with the strength of the tie that binds them. Our choice is grounded in the fact that most collaborative filtering system models rely on strong-tie models. Another fundamental flaw of current models is that they do not convincingly relate micro-level interactions to macro-level models. Statistical as well as qualitative studies offer a good body of research on this phenomenon.

 Referencias:

[Adomavicius and Tuzhilin, 2005] Adomavicius, G., and A. Tuzhilin. 2005. Toward the next generation of recommender systems: A survey of the state-of-the-art and possible extensions. IEEE Transactions on Knowledge and Data Engineering 17 (6): 734-749.

[Herrera-Viedma et al., 2003] E. Herrera-Viedma, L. Olvera, E. Peis, C. Porcel. 2003. Revisión de los sistemas de recomendaciones para la recuperación de información. In Tendencias de investigación en organización del conocimiento / Trends in knowledge organization research, José Antonio Frías and Críspulo Travieso (eds.), Universidad de Salamanca, 507-513.

[Herlocker et al., 1999] Herlocker, J. L., J. A. Konstan, A. Borchers, and J. Riedl. 1999. An algorithmic framework for performing collaborative filtering. In SIGIR '99: Proceedings of the 22nd International Conference on Research and Development in Information Retrieval, edited by M. Hearst, F. Gey and R. Tong. New York: Assoc. Computing Machinery, 230-237.

[Ungar and Foster, 1998] Ungar, L. H., and D. P. Foster. 1998. Clustering Methods for Collaborative Filtering. Paper read at the Proceedings of the Workshop on Recommendation Systems.

[Velez & Santos, 2006] Velez, O., and C. Santos. 2006. Sistemas Recomendadores: Un enfoque desde los algoritmos genéticos. Industrial Data, vol. 9, no. 1. Universidad Nacional Mayor de San Marcos, Lima, Perú. 23-31.

Scientific Collaboration Networks for Academic Research

In the 21st century and beyond, collaboration will be essential for carrying out large-scale projects in any field, and especially in science and technology.

Traditionally, higher education institutions in the Dominican Republic have operated in a spirit of competition. This phenomenon reflects the relative youth of most universities, which are still in a stage of consolidation and positioning at the national and regional level. This initial need for differentiation creates powerful barriers to integration and to inter-institutional mobility.

In the Dominican context, competition within higher education markets is, above all, positional, in a double sense. At the top, institutions compete for the most sought-after students, and students compete for prestigious opportunities (institutions with high reputation, selectivity, high quality, and so on). Moving down the institutional hierarchy, competition takes on a different meaning and becomes, essentially, competition for enrollment rather than for quality. At the bottom of the market, Dominican universities can no longer be selective about whom they serve; they must operate an open-door policy and compete simply to attract students.

Nevertheless, the demands of the educational models of societies trying to join the currents of development call for generating knowledge through research and through the synergies produced by an institution with multiple purposes and multiple relationships.

Research in the Dominican Republic is scattered across the different higher education institutions and research centers, where specialists work individually (scientific islands), missing the opportunity to share achievements and results that, by joining forces, could make significant contributions to the development of science and technology in the country.

Despite the opportunities the national landscape offers to access advanced knowledge, develop strategies and take part in research projects, some issues prevent the country's research lines from developing properly. The main problem we must solve in order to join the current advance of science and technology is internal collaboration. Hence the value of combining our own efforts in the production of scientific and technological knowledge.

As the Ministry of Higher Education, Science and Technology has stated, creating research and development networks is fundamental to amplifying the impact of the results obtained from research projects; hence the importance of building tools that facilitate the linking and formation of these groups of professionals with “common and/or complementary interests” in research.

We face a great opportunity to advance in science and technology at the national and regional level. The great challenge is how to turn the avalanche of information spreading through the different media into useful knowledge, and how to harness the process of generating and appropriating knowledge to drive dynamic processes of social change, through which knowledge creates and strengthens the capabilities and skills of the people and organizations that appropriate it, becoming a factor of change. More important still is how to combine isolated efforts around a collective scientific interest that allows higher-impact achievements through inter-institutional collaboration.

In this sense, the formation of scientific networks through digital information networks is called to play an important role in the processes of knowledge generation and appropriation.

It has been shown that linking institutions with research activity through networks enables some of the largest flows of cooperation and information exchange. Fostering the creation of national academic research networks introduces a dynamic component that favors interactions among the different actors. These networks provide an ideal mechanism for actors who find themselves isolated, even in regions with less scientific, technological or social development. This problem of asymmetric distribution of scientific and technological capabilities is present, to a greater or lesser degree, throughout the region, which is why networks that locate and connect individuals with common interests are an alternative for alleviating it, especially in countries like the Dominican Republic, where critical mass is insufficient and research and development groups show weaknesses.

Solutions of this kind make it possible not only to take advantage of the existing critical mass but also to develop synergies from collaboration among research groups, in order to tackle topics and projects of greater scope and complexity and of greater scientific, technological, economic and social impact.

Research in the Context of Higher Education in the Dominican Republic

Science and technology are now recognized as decisive factors in economic and social transformation, not only for industrialized countries, where the emergence of a new knowledge economy is evident, but also for developing countries. This is very clear today in the context of a scientific and technological revolution that dominates the international scene and has become a political and economic fact of the first magnitude. The Dominican Republic is no stranger to this phenomenon, as shown by various government initiatives aimed at boosting local scientific production. A concrete example is the Fondo Nacional de Innovación y Desarrollo Científico y Tecnológico (Fondocyt), dedicated to developing and funding technological innovation and applied scientific research activities, programs and projects, with the aim of establishing a permanent system for promoting national scientific and technological research.

Despite the opportunities the national landscape offers to access advanced knowledge, train human resources and carry out scientific activities, some issues prevent the country's research lines from developing properly. One of the main problems we must solve in order to join the current advance of science and technology is internal collaboration between local institutions and academic staff in the production of scientific and technological knowledge.

Given the growth and reach of the Internet and the capabilities it offers for associative processes, it is an ideal tool for designing models that transform the network from an information space into a space of knowledge and distributed collaboration. This capability has driven a genuine revolution in the way research is carried out, leading to advances in knowledge and, even more, in its social use.

Wednesday, February 12, 2014

Memories of COMPUEXPO - Dominican Republic

An entire generation of children and young people, who today hold management positions in important national and international institutions or have built their own companies, grew up with COMPUEXPO as the event awaited by everyone devoted to technological progress. In an era when communications were only beginning to develop, when we could only read in magazines and books that reached the libraries or that some acquaintance brought back, or hear, about this or that trend, product, service or new device, it was not until October of each year that we could see it, touch it and get to know it, through COMPUEXPO.

In the late 1980s and early 1990s, at one edition of COMPUEXPO, Dominican executives were introduced to the marvel of the fax (facsimile): the business solution capable of transmitting copies of documents over telephone lines. It seemed like magic! How much we have seen since then! In my particular case, thanks to a COMPUEXPO event I was able to touch a computer for the first time and, at barely 9 years old, learn my first programming instructions in the LOGO language, in one of the ballrooms of the Hotel Concorde, on those famous Tandy 1000 computers.

Then, year after year, we got to know the modem, the mouse, graphical interfaces, point-to-point chat, local area network services, servers, color monitors, video games, mobile communications, the Internet, GPS; just to mention a few of what now seem like simple applications but in their day were the novelty of the moment.

However, technology as such, or rather the use of technology, can be a threat when it is not understood, because it becomes the property of a limited group of salaried experts. At the same time, it represents an opportunity for democratization when the number of people who do understand it grows substantially.

New challenges arise constantly, behind the advances that emerge every day in technology, medicine, the environment and many other aspects of our daily lives. At the same time, these new challenges demand ever more human resources with exceptional talents: human values, originality, ethics, passion and creativity within a specific area of practice.

In this context, I quote the words of Monsignor Núñez Collado, published in his book Computación y Educación Superior, written in 1986:

“The positive influence of technology on human progress is evident, but we cannot lose sight of the fact that, at the stage of development the world is going through, technological development in general and information science in particular could have a negative impact if we do not bring into the technological world moral ingredients, transcendent values and a human purpose valid in itself, one that preserves in man his sense of equity, dignity and justice. The world of tomorrow will depend more on its moral precepts than on its abundance of material goods or instruments of domination, and will have as indispensable ingredients individual and collective dignity, and the capacity to think, decide and act with freedom, responsibility and spiritual nobility.”

In the immediate future, our Dominican society faces the challenge of educating the engineers of the future: problem solvers, creators of new knowledge, capable of going beyond simple technological implementations to meet human and social needs, and inspired by a thirst for innovation.

Notwithstanding this declared intention, the work required to carry out this mission is marked by the obligation to deal with a new generation influenced by:

  1. The reality that we are living in exponential times. “If over the last 25 years the aviation industry had experienced the spectacular evolution that computing has, a Boeing 767 would today cost 350 dollars and would circle the globe in 20 minutes on about 20 liters of fuel.”
  2. The context of a world (society) in a crisis of values, limited in its natural resources and marked by inequality and social tension.
  3. We are educating individuals from generations that belong to a mobile society, accustomed to the ephemeral, who see human capital as a personal rather than an institutional asset. Consequently, they are not willing to build a career serving one institution all their lives. They do not seek to be trained to become employees; they want to be entrepreneurs.
  4. The amount of new, relevant information generated in 2009 is estimated at 4 exabytes (a figure with 18 zeros). In other words, the amount of new technical information doubles roughly every 2 years. For students starting a 4-year degree, 50% of the knowledge acquired during the first year will be obsolete before the end of their third year of study.
  5. According to a study by the US Department of Labor, the 10 most in-demand technical jobs in 2009 did not exist in 2004. This means preparing students for jobs that do not yet exist, using technologies that have not yet been invented, to solve problems we do not yet know are problems.

Formal Methods and Project Planning for the Software Process

From my experience as a software developer and manager of complex software projects, I have learned over time that to execute a software project successfully, besides having a hard-working team, it is essential to have a clearly defined plan that all parties understand and endorse.

Over the past years we have tried to align our administrative processes with the CMMI standard. However, in practice, for a medium-sized company, achieving this is very expensive and time consuming, and it doesn't pay off from the customers' perspective. Small and medium software development companies represent the largest segment of the software development industry in the US.

I believe there is real value in following the process, but I also think that whether companies follow their CMMI processes depends in large part on their customers. In the federal government market, I found that it is a requirement in the Request for Quote, but the customer doesn't really want to pay the costs associated with it. Formal models such as CMMI were developed without taking into account small businesses and their limitations, nor have they been adapted to facilitate their adoption. This brings to my attention the following questions:
  • How to calculate/evaluate the return on investment of introducing CMMI in small companies, and specifically Project Planning practices.
  • How many developers must a company have so that process improvement and project management, with all their components (PP, PMC, etc.), save money within a feasible time frame?
  • Is it necessary (from an official CMMI point of view) to introduce expensive tools, or can small companies reach CMMI level 2 without any tools?
  • Is marketing the main reason companies invest in this certification, or is there real value in following the CMMI processes?
  • Do companies actually follow CMMI processes after certification?
The Reality 

Process improvement in small enterprises is a problem that has been studied with growing interest since 2005 (Mondragon, 2006). It is naturally limited by the constraints of small businesses:
  1. Company cash flow: adequate cash flow is what funds the resources of a process improvement project. In small companies (fewer than 25 employees) the schedules of the technology experts are normally committed above 100%. In many cases small businesses are not competent at estimating effort and team performance expectations, nor at planning or formally managing their projects.
  2. People skills: people with higher education have usually developed much stronger analytical thinking skills than people without this training. Developing training guides and process guidelines is required to deploy a complete process improvement solution.
  3. Project size: project size is a variable that directly affects the amount of communication, information and skills needed for proper performance. In large projects, software engineering practices become essential to produce work that meets the objectives: satisfying the requirements, meeting the schedule, respecting the project's cost, providing the expected quality and achieving the expected productivity (Goldenson, 2010). In this particular respect, project planning plays a determining role in project success.
Project planning done right can bring peace of mind, and even outright relief, to the most complex projects. Project planning done wrong, on the other hand, is easy to detect: weeks and months of delays, a blown budget, angry clients and, most likely, a bad ending.

Many things lead to project success and many others lead to failure. A successful project depends on a combination of many variables, including practices, experience, methodologies, internal and external factors, etc. However, we can conclude that among those important variables, appropriate Project Planning is one of the primary indicators of a high chance of success.

The PP process area requires excellent forward planning, which includes detailed planning of the process implementation stages, task timelines, fallback positions, and re-planning. Initial planning is not enough. Projects often take wrong turns, or initial solutions prove unfounded. The project manager who does not prepare to re-plan, or has not considered and planned fallback positions for when initial plans fail, will often find that the project first stalls, and then fails. We must remember that project management is not a straight-line process, but an iterative one that requires agile rethinking as the known environment changes before your eyes (Anil, 1991).

Project failure is preventable with good project planning based on a well-constructed, deliverables-based Work Breakdown Structure and proper controls. There may be some casualties along the way, such as some reduction in scope, additional time, and/or additional cost, but with good project planning and timely intervention where required, these can be minimized.
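As an illustration only (the deliverable names, effort and cost figures below are hypothetical, not drawn from any cited source), a deliverables-based WBS can be kept as a simple tree whose leaf estimates roll up into the project totals, which is what makes scope, schedule and budget deviations visible early enough to re-plan:

```python
# Hypothetical deliverables-based Work Breakdown Structure with effort/cost roll-up.
from dataclasses import dataclass, field

@dataclass
class WBSElement:
    name: str
    effort_days: float = 0.0              # leaf-level estimate
    cost: float = 0.0                     # leaf-level estimate
    children: list = field(default_factory=list)

    def total_effort(self) -> float:
        """Effort of this element plus all of its descendants."""
        return self.effort_days + sum(c.total_effort() for c in self.children)

    def total_cost(self) -> float:
        """Cost of this element plus all of its descendants."""
        return self.cost + sum(c.total_cost() for c in self.children)

# Illustrative deliverables for a small software project.
project = WBSElement("Billing module", children=[
    WBSElement("Requirements specification", effort_days=5, cost=2500),
    WBSElement("Design document", effort_days=8, cost=4000),
    WBSElement("Implementation", children=[
        WBSElement("Invoice API", effort_days=15, cost=7500),
        WBSElement("Reports UI", effort_days=10, cost=5000),
    ]),
    WBSElement("Acceptance test report", effort_days=6, cost=3000),
])

print(project.total_effort(), project.total_cost())  # 44.0 22000.0
```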

Finally, going formal represents a big step for small companies, and it is a decision that requires some sacrifice of working time and money. However, this should not be an excuse for forgoing the advantages of formal methodologies. Implementing formal project management and proper planning can be a very good first step. After all, small businesses should ask themselves: how are we going to eat this elephant (CMMI)? The only possible answer is simple: in small bytes! As stated in the Practices topic, start crawling, follow these guidelines, and before you know it you might be walking.

It is important for a good manager to be knowledgeable about these techniques and methodologies. Planning ahead is the best medicine. Prevention is the best of all cures.

References

Anil, Iyer and Thomasson, David (1991). “An Empirical Investigation of the Use of Content Analysis to Define the Variables Most Prevalent in Project Successes and Failures”, Proceedings of the 1991 PMI Annual Seminar/Symposium.

Mondragon, O. (2006). “Addressing infrastructure issues in very small settings”. In Proceedings of the First International Research Workshop for Process Improvement in Small Settings. Software Engineering Institute, Carnegie Mellon University.

Goldenson, Dennis and Herbsleb, James (1995). “After the appraisal: A systematic survey of process improvement, its benefits and factors that influence success”. Technical Report CMU/SEI-95-TR-009, ADA302225, Software Engineering Institute, Carnegie Mellon University.

Outsourcing & Insourcing: Current Trends for the Software Market

Enhanced international trade agreements, the incorporation of new countries into global economic cycles, the increase in air traffic, the exponential growth in the quality and bandwidth of telecommunications, and the spread of internet culture around the globe are all factors that are dramatically changing markets in every country. Increased competition has led more companies to seek outside expertise for all of their processes. Focusing on the core business and moving support functions to a third party has a high impact on costs and on the quality of services. This is what outsourcing promises.

Out-Sourcing Trends 

This service, analogous to industrial production processes, can reduce risks in the construction and maintenance of software projects and brings direct benefits to the reliability of, and satisfaction with, the delivered products, giving projects clearer budgets and tighter timetables.

This service concept allows for the optimization of resources and technological potential, and for the advantages gained through economies of scale and an improved cost-benefit ratio. It can be applied to the full development of new projects or of individual modules, as well as to the maintenance of production systems. Three working schemas can be applied under this model: complete project development, functional cases (parts or modules of a project), and resource/day-hour (specific functions).

Clients who outsource software to outside providers are expecting nothing less than great quality, as the IT development outsourcing scene matures. After about a decade of growth, it is time for superior customer service, reliable organization, modern management and, most important of all, top-notch solutions. Knowing that they can turn to hundreds of other outsourcing firms competing for their budgets, clients are likely to reward not just affordable prices, but sustainable high quality (ClearCode, 2011).

New concepts are also emerging within the software outsourcing business, and along with them new forms of business appear. That is the case of nearshoring, farmshoring and cloudshoring: outsourcing to a nearby country, to a rural area, or moving operations to an IT cloud (for computing power, storage, bandwidth, processing, etc.), respectively. According to the forecast made by Clear Code in its article “Software Outsourcing Trends for 2011”, in 2011 these ideas will make it possible to cut costs further, as well as improve management and control over third-party contract execution.

Cloud sourcing, which has often been predicted to be the death of outsourcing, will soon merge with the existing outsourcing market and provide better opportunities for the entire industry. Infrastructures supported by cloud resources and based on SOA principles will encourage smaller outsourcing providers, which will in turn energize the outsourcing market by heightening competition and lowering prices (OutSource2India, 2011).

International outsourcing of services has increased in the United States but still remains low, based on our economy-wide measure using International Monetary Fund trade data. Imports of computer software and information plus other business services as a share of GDP were only 0.4 percent in 2003. This share has roughly doubled each decade: from 0.1 percent in 1983 to 0.2 percent in 1993 and to 0.4 percent in 2003. The United Kingdom has a higher outsourcing ratio than the United States: 0.9 percent in 1983, 0.7 percent in 1993, and 1.2 percent in 2003 (Amiti, 2004).

Finally, industry experts predict the emergence of a Latin American outsourcing boom, especially in Brazil, Mexico, Chile, Colombia, Costa Rica and Peru. Service providers will also continue to shift their delivery centers to markets such as China, the Philippines and Egypt, since these countries represent big markets with big demand for transformational and discretionary spend activity (OutSource2India, 2011).

In-Sourcing Trends 

The opposite of outsourcing can be defined as insourcing. When an organization delegates its work to another entity that is internal yet not part of the organization, this is termed insourcing. The internal entity will usually have a specialized team that is proficient in providing the required services. Organizations sometimes opt for insourcing because it enables them to maintain better control of what they outsource.

Organizations involved in production usually opt for insourcing in order to cut labor costs and taxes, among other expenses. The trend toward insourcing has increased since 2006. Organizations that have been dissatisfied with outsourcing have moved toward insourcing. Some organizations feel that by insourcing their work rather than outsourcing it, they can have better customer support and better control over the work. According to recent studies, there is more work insourced than outsourced in the U.S. and the U.K. These countries are currently the largest outsourcers in the world, and they outsource and insource work in roughly equal measure (OutSource2India, 2011).

Professor Matthew Slaughter of Dartmouth College presented a study of the insourcing market in the USA. His findings highlighted the following trends:
  • Insourcing companies employed over 5.4 million U.S. workers. This was nearly 5 percent of total private-sector employment, up from just 3 percent in 1987.
  • The share of U.S. private-sector capital investment accounted for by insourcing companies rose from over 8 percent in 1992 to over 10 percent ($111.9 billion).
  • For many years insourcing companies have accounted for around 20 percent of U.S. exports of goods, now $137 billion.
  • Insourcing companies paid their American workers over $307 billion in compensation, more than 6 percent of all U.S. private-sector labor compensation.
Pros and Cons

This form of contracting has its promoters and defenders, but also its detractors. Among the arguments against subcontracting, opponents mention:
  1. Professional employees or subcontractors may not feel loyal to the company contracting the service because, in fact, their loyalty belongs to the contractor.
  2. The working conditions of these workers are usually not the best; for example, they are hired on a temporary basis even though the workflow is continuous. Critics of the subcontracting system argue that this arrangement is a contractual cover for the abuse of labor rights.
  3. The outsourcing system often eliminates jobs in the local labor market.
On the positive side, outsourcing is claimed to:
  1. Allow obtaining products and services of better quality elsewhere when they are not found in the local market.
  2. Reduce production costs.
  3. Reduce the number of routine tasks in the contracting company, allowing employees to focus on more creative and productive aspects of their work.
Regarding CMM and outsourcing, there is a sense of pressure in the market that if outsourcing providers are not CMM certified, customers will hesitate to hand over their projects. However, according to Mark Hillary and his article “CMM might be mature, but is it adapted?”, the CMM models do not by themselves yield better quality, mostly because most of the smaller companies are not even equipped to provide their offshore suppliers with the required inputs in terms of specifications, validation, etc. This consultant also reported that the relationship he is building with his customers does not touch on CMM (although they have the accreditation), but rather revolves around the frequency of communication, the quality of deliverables, mixed teams with people on both sides of the ocean, and so on.

Immigration Policies for Foreign IT Graduates

According to a news release from US Immigration and Customs Enforcement, ICE announced an expanded list of science, technology, engineering, and math degree programs that qualify eligible graduates to extend their post-graduate training.

The current administration of President Obama has reiterated its decision and strong support, as part of comprehensive reform, for new policies that embrace talented students from other countries who enrich the nation by working in science and technology jobs in the United States.

This reform includes the expansion of the degrees and fields considered important for the US economy. The list includes a comprehensive set of careers related to mathematics, high tech and computer science. According to the US Department of Labor, these areas are suffering from a shortage of skilled workers. Again, the Obama administration is helping to address shortages of talented scientists and technology experts in certain high-tech sectors by permitting highly skilled foreign graduates who wish to work in their field of study upon graduation to extend their post-graduate training in the United States.

Under the Optional Practical Training (OPT) program, foreign students who graduate from U.S. colleges and universities are able to remain in the U.S. and receive training through work experience for up to 12 months. Students who graduate with one of the newly expanded STEM degrees can remain for an additional 17 months on an OPT STEM extension (US Immigration Office, 2011).

References

Amiti, M. (2004). “Fear of Service Outsourcing: Is It Justified?”. Working Paper. International Monetary Fund.

Clear Code (2011). “Software Outsourcing Trends in 2011”. Visited on May 17, 2011. Online at: http://clearcode.cc/2011/01/17/software-development-outsourcing-trends-2011/

Hillary, M. (2007). “CMM might be mature, but is it adapted?”. Visited on May 16, 2011. Online at: http://www.it-outsourcing-china.hyveup.tv/2007/05/cmm-might-be-mature-but-is-it-adapted/

Kirkegaard, F. (2004). “Outsourcing-Stains on the White Collar?”. Institute for International Economics.

OutSource2India (2011). “The Future of OutSourcing”. Visited on May 12, 2011. Online at: http://www.outsource2india.com/trends/future_outsourcing.asp

Slaughter, M. (2006). “Insourcing Jobs: Making the Global Economy Work for America”. Tuck School of Business at Dartmouth.

US Immigration Office (2011). News Release. “ICE announces expanded list of science, technology, engineering, and math degree programs that qualifies eligible graduates to extend their post-graduate training”. Visited on May 12, 2011. Online at: http://www.ice.gov/news/releases/1105/110512washingtondc2.htm

The Human Body as a Computing Interface

< Interface /ˈint-ər-ˌfās/: The point of interconnection between two entities.>


Interfaces enter our lives in the form of the various devices, analog or digital, with which we normally establish some kind of interaction. In this sense, interfaces are "tools" that extend our bodies, such as computers, cell phones, elevators, etc. The concept of an interface applies to any situation or process where an exchange or transfer of information takes place. One way of thinking about an interface is as "the area or place of interaction between two different systems", not necessarily technological ones. Traditional computer input devices leverage the dexterity of our limbs through physical transducers such as keys, buttons, and touch screens. While these controls make great use of our abilities in common scenarios, many everyday situations command the use of our body for purposes other than manipulating an input device (Saponas, 2010, p. 8). Humans are very familiar with their own bodies. By nature, humans gesture with their body parts to express themselves or communicate ideas. Therefore, body parts naturally lend themselves to various interface metaphors that can be used as interaction tools for computerized systems.

For example, imagine rushing to a class on a very cold morning while wearing gloves when, all of a sudden, you have to place a phone call to a classmate to remind him to print out a homework assignment; dialing a simple call on a mobile phone's interface in this situation can be difficult or even impossible. Similarly, when someone is jogging and listening to music on a music player, their arms are typically swinging freely and their eyes are focused on what is in front of them, making it awkward to reach for the controls to skip songs or change the volume. In these situations, people need alternative input techniques for interacting with their computing devices (Saponas, 2009, p. 4).

Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, our sense of how our body is configured in three-dimensional space allows us to accurately interact with our bodies in an eyes-free manner (Harrison, 2010, p. 11).

In terms of interface suitability and human needs, researchers have been looking for ways to give users greater mobility and to enable more and more interaction. Yet even though the interaction these new interfaces afford is greater, users do not have a clear mental model of how they operate, since in some cases they cease to be intuitive and demand constant relearning from users. Nevertheless, several research areas offer possibilities for fully incorporating the body into the interface process, such as speech recognition, gesture detection, computer vision, micro-gestures, the skin surface, body electricity, brain computing, and muscle gestures, among others.

Current research that explores different ways to use the features of one's own body for interacting with computers, presented by The Imaging Research Center of South Korea, divides this area into four types of human-body-based interfaces:

  1. Body Inspired Metaphor (BIM): Uses various parts of the body as metaphoric interaction.
  2. Body As An Interaction Surface (BAIS): Uses parts of the body as points of interaction. In this model, researchers are investigating which parts of the human body are most suitable to be used as an interface for a given task. They are trying to find the best spot, taking into account cognitive and ergonomic factors. So far, they have found that one of the most plausible locations seems to be the forearm of the non-dominant hand, for its mobility, accessibility to the dominant hand, and visibility, although other parts of the body may be considered, such as the lap (Changseok, 2009, p. 264).
  3. Object-Mapping (OM): Transports the user into the location of the object by becoming it, and manipulates it from a first-person viewpoint, both physically and mentally.
  4. Mixed Mode (MM): A mix of BIM and BAIS.

To give an example of how the hand, the body and now the skin are being used in the digital interaction process, we could point to the film starring Tom Cruise in which the computer interface is manipulated by hand gestures, or to body movement and gesture recognition such as Microsoft's recent "Project Natal" (commercially known as "Kinect"), developed as an interface for the Xbox 360. We are without doubt in the era of the touchpad, but we were far from imagining that we could operate an interface by gestures alone, as with the Gesture Cube prototype, a "cube" that interprets the movement of the hands and lets us act on different devices without having to touch a hand tool, just by moving within a short distance. The Gesture Cube has a series of sensors that instantly detect the position of the hand and transmit the coordinates to a CPU installed inside it, so that previously programmed motions trigger a specific task such as opening a program, calling someone, or playing music.

Contrary to what we might think, GestIC, as its creators have called this cube, does not have sensors that read the position of the hands optically; instead it is equipped with an array of sensors, grouped in fours, that measure the variation of the electric field around the human skin as the distance changes. An interesting addition is that the interface allows you to associate a different device with each face of the cube.
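As a purely hypothetical sketch (the gesture labels and actions are invented for illustration and are not part of the GestIC product), the "previously programmed motions" described above amount to a mapping from recognized gestures to commands:

```python
# Hypothetical dispatch of recognized gestures to pre-programmed actions.
from typing import Callable, Dict

def open_program() -> None:
    print("opening program")

def call_contact() -> None:
    print("placing a call")

def play_music() -> None:
    print("playing music")

# One pre-programmed action per recognized gesture label.
GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "swipe_left": open_program,
    "circle": call_contact,
    "tap_top_face": play_music,
}

def handle_gesture(label: str) -> None:
    action = GESTURE_ACTIONS.get(label)
    if action is not None:
        action()

handle_gesture("circle")  # -> placing a call
```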

Another area is muscle sensing. While muscle-sensing techniques promise to be a suitable mechanism for body interfaces, previous work suffers from several key limitations. In many existing systems, users are tethered to high-end equipment employing gel-based sensors affixed to users' arms with adhesives (Saponas, 2009, p. 19). Other efforts developed experiments in which motor neurons stimulate muscle fibers in the skeletal muscles, causing movement or force. This process generates electrical activity that can be measured as a voltage differential changing over time. While the most accurate method of measuring such electrical activity requires inserting fine needles into the muscle, a noisier signal can be obtained using electrodes on the surface of the skin. So far these experiments have yielded little success (Mastnik, 2008, p. 64).

However, a recent project called Skinput, demonstrated by Microsoft Research, represents an enormous advance in this area. Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body (for example, on a person's forearm). The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group.

Wearable computing and virtual reality would be ideal application areas for body interface technologies. For instance, one of the defining goals of a virtual reality system is to create the feeling of being in the environment, and one cause of breaking presence is the existence of intrusive wired sensing devices. While body-based interfaces may not increase realism, they may still find good uses in imaginary virtual worlds, increasing self-awareness through self-interaction (Changseok, 2009, p. 271).

Another study, conducted by Nokia Research Center, suggested the concept of "virtual pockets" for opening and saving documents in a wearable computing setting. Virtual pockets are physical pockets on one's clothing augmented with pressure sensors and woven with a special material for tracking the finger position on the clothing surface. A user can move files between different pockets, a process analogous to "dragging and dropping" in the familiar desktop environment. Using finger pressure, files can be opened or saved. This can be viewed as mapping the desktop space onto the front surface of the upper body (Changseok, 2009, p. 269).

When exploring the body as an interaction device, the challenges are how to exploit the body's potential in the interaction context, and what influence and significance the use of the body has on interactive experiences. Characteristics to consider when using the body include: a small or large degree of bodily involvement in the interaction; a lesser or greater accentuation of the significance of the body in the user experience; and finally, a small or large degree of user influence (Karen, 2008, p. 2). According to experts in the subject, body interfaces can help reduce task completion time and errors because they are natural and less confusing to users. However, they also point out that excessive movement of body parts can cause muscle fatigue; therefore, not all tasks are suitable for association with body parts.

Another trend in the quest to integrate the human body into the interface process holds that, to accomplish such goals, other areas besides HCI, such as electronics, bioinformatics and materials science, must evolve in their own right. A research project called Communications Through Virtual Technologies, sponsored by the Association of European Telecoms, concluded that unobtrusive hardware miniaturization is assumed to enable the necessary developments in micro- and optical electronics required for using the body as a computer interaction device. Molecular and atomic manipulation techniques will also be increasingly required to allow the creation of advanced materials, smart materials and nanotechnologies (Fabrizzio, 2009, p. 33).

In addition to these conclusions, the same study adds that significant advances are also required in the following areas:

a) Self-generating power and micro-power usage in devices.

b) Active devices, such as sensors and actuators integrated with interface systems, that respond to the user's senses, posture and environment and can change their characteristics through standalone intelligence or networked interaction.

c) Nano devices with lower power consumption, higher operation speeds, and ubiquity.

In the current stage of HCI research, a slight finger tap, an acoustic vibration in the air, a movement of the eyes and tongue, or a pulse in the muscle can become a method for information transmission, and people are not only interacting with computers, but also with every object around them (Hui, 2010, p. 1).

Citations and References


Changseok, C. (2004). Body Based Interfaces. In Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces (ICMI '03).
Fabrizzio, D. (2009). Communications Through Virtual Technologies. In Galimberti and G. Riva (eds.), La comunicazione virtuale, Guerini e Associati, Milano.
Harrison, C. (2010). Skinput: Appropriating the Body as an Input Surface. In Proceedings of ACM CHI 2010.
Hui, M. (2010). Human Computer Interaction, A Portal to the Future. Microsoft Research.
Karen, J. (2008). Interaction Design for Public Spaces. ACM MM '08, October 26–31, 2008, Vancouver, British Columbia, Canada.
Mastnik, S. (2008). EMG-based Hand Gesture Recognition for Realtime Biosignal Interfacing. In Proceedings of ACM IUI '08, 30-39.
Musilek, P. (2007). A Keystroke and Pointer Control Input Interface for Wearable Computers. In Proceedings of IEEE PERCOM '07.
Saponas, T. (2009). Enabling Always-Available Input with Muscle-Computer Interfaces. In Proceedings of ACM UIST '09.
Saponas, T. (2010). Making Muscle-Computer Interfaces More Practical. In Proceedings of ACM CHI 2010.
Saponas, T. (2009). Demonstrating the Feasibility of Using Forearm Electromyography for Muscle-Computer Interfaces. In Proceedings of ACM CHI '09.

Understanding Brain Computer Interfaces

The human brain and body are prolific signal generators. Recent technologies and computing techniques allow us to measure, process and interpret these signals. We can now infer such things as cognitive and emotional states to create adaptive interactive systems and to gain an understanding of user experience (Girouard, 2010).

Brain-computer interface (BCI) technology can be defined as an HCI system that translates our mental intentions into real interaction within a physical or virtual world. The basic operations of a BCI are to measure brain activity, process it to obtain the characteristics of interest and, with these characteristics, interact with the environment as the user desires. From the standpoint of human-computer interaction, BCI-like interfaces have two characteristics that make them unique compared to all existing systems. The first is their potential to build a natural communication channel with the human. The second is their potential to access cognitive and emotional information from the user. This work addresses brain-computer interface technology from a technological point of view by presenting its current context, its technological problems and the associated research.

Computer interfaces as we normally know them are not natural, in the sense that human thoughts must be translated to match the type of interface. For example, while using a keyboard, the thought of writing the letter "X" must be translated into the press of a finger on a given key. Although this is efficient and serves to accomplish the task, it does not represent natural user interaction; in fact, without training, the user would not know how to complete the operation. BCI interfaces in principle have access to human cognitive information, as they are based on measuring brain activity, which is assumed to encode all these aspects. The scientific and technological challenge is to decode this information from a continuous and huge volume of data.

Current interfaces such as pointing devices, keyboards or eye trackers are systems that convert the user's control intentions into actions. However, they are not natural ways to model and implement the interaction, and they lack the potential to access cognitive information such as workload, perception of system errors, affective information, etc. (Girouard, 2010). BCI can build a natural communication channel between the human and the machine, as it translates intentions directly into orders.

The idea behind this technology is very simple: to turn our thoughts into real actions in our environment. These actions can be directed at something as simple as turning the lights of our house on or off, or at a machine as complex as a wheelchair. The idea is simple, but the technological challenge is enormous, because it involves a highly multidisciplinary body of knowledge at the intersection of neuroscience, biomedical engineering and computer science.

A BCI, seen as a machine that translates human intentions into action, has at least three distinct parts (Minguez, 2009); a minimal sketch of this pipeline follows the list:

1) Sensor: responsible for collecting brain activity. The vast majority of sensory modalities used in BCI come from clinical applications, such as electroencephalography, functional magnetic resonance imaging, etc.
2) Signal Processing Engine: this module collects the measured brain activity signal and applies filters to decode the neurophysiological process reflecting the intention of the user.
3) Application: the module that interacts with the environment and shapes the final application of the BCI. This may be moving a wheelchair or writing by thought on a computer screen.
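As a rough illustration of these three parts working together, the following Python sketch wires a stand-in sensor, a toy signal-processing engine and an application module into one pipeline. The classes, the random "EEG" samples and the decoding rule are assumptions made purely for illustration; they are not an actual BCI implementation.

    import random

    class Sensor:
        """Stands in for an EEG acquisition device: yields raw samples."""
        def read(self, n_samples=256):
            return [random.gauss(0.0, 1.0) for _ in range(n_samples)]

    class SignalProcessingEngine:
        """Processes the raw signal and decodes a (made-up) user intention."""
        def decode(self, samples):
            # Toy decoding rule: the sign of the mean sample decides the command.
            return "forward" if sum(samples) / len(samples) > 0 else "stop"

    class Application:
        """The interaction module, e.g. a wheelchair controller or a speller."""
        def execute(self, command):
            print("wheelchair command:", command)

    # Wire the three modules together, mirroring the list above.
    sensor, engine, app = Sensor(), SignalProcessingEngine(), Application()
    app.execute(engine.decode(sensor.read()))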

All research taking place in BCI can be classified within these three points. First, researchers are working on new sensory modalities that enhance the temporal and spatial resolution of brain activity measurements, and on improving the usability and portability of the devices in general. Second, much research is being conducted on strategies to address BCI signal processing. The most relevant aspects that complicate the problem are that each individual has different brain activity and that brain activity is non-stationary. The work is focused on improving the filtering processes, automatic signal learning, and adaptation to each particular individual over time (McCullagh, 2010). The final aspect is integrating the BCI into an application that is useful for the user, which is encouraging efforts in areas such as hardware and software integration and inclusion in actual application environments.
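To illustrate the per-user adaptation mentioned above, the sketch below re-trains a simple decoder on calibration data recorded from one user before online use. The synthetic band-power features, the labels and the choice of linear discriminant analysis (via scikit-learn) are illustrative assumptions, not the method of any cited work.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Pretend calibration session: 100 trials of 8 band-power features each,
    # labeled with the intention the user was asked to produce (0 = rest, 1 = move).
    X_calibration = rng.normal(size=(100, 8))
    y_calibration = rng.integers(0, 2, size=100)

    # Fit a subject-specific decoder on that user's calibration data.
    decoder = LinearDiscriminantAnalysis()
    decoder.fit(X_calibration, y_calibration)

    # During later online use, new feature vectors from the same user are decoded;
    # repeating the calibration periodically is one way to follow non-stationarity.
    X_online = rng.normal(size=(5, 8))
    print(decoder.predict(X_online))  # e.g. an array of 0/1 intention labels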

One important challenge facing HCI research is where to place the sensors with respect to the human body. This choice has important implications for usability, ethics and system design, since it determines the type of neuronal process that can be measured and processed later. If the sensor is placed so that no intrusion is performed on the human body, the technique is called non-invasive, and this is the most widely used approach in BCI. Other techniques, however, require performing a craniotomy; in this case we talk about an invasive technique. Broadly speaking, there are different levels of penetration and placement of sensor systems, varying from electrodes that penetrate the cerebral cortex to electrodes placed over the surface of the cortex to measure its activity. Beyond the ethical problems of these invasive technologies, they face the difficulty of maintaining a stable sensing mechanism, because a small movement of the sensor may involve a large movement at the cellular level, activating the body's defenses, which attack the "intrusive sensors" until they are disabled (Ferrez, 2009).

Recovering or replacing human motor functions has been one of the most fascinating but frustrating areas of research of the last century. The possibility of interfacing the human nervous system with a robotic or mechatronic system, and using this concept to recover some motor function, has fascinated scientists for years (Minguez, 2009). The typical working paradigm is a patient with a severe spinal cord injury or a chronic neuromuscular disease that interrupts the flow of motor neural information to the body's extremities. One aspect that has enabled these developments has been the advance in technology, since BCIs are systems that translate, in real time, the electrical activity resulting from thinking in order to directly control devices. This provides a direct communication channel from the central nervous system to devices, avoiding the neural pathways that can no longer be used normally because of severe neuromuscular conditions such as stroke, cerebral palsy or spinal injuries (Ferrez, 2008). On the other hand, robotics has advanced enormously in recent years in fields ranging from sensors and actuators to processing capacity and autonomy.

The first element in a BCI is a device for measuring brain activity, which is usually a clinical device that measures that activity directly or indirectly. Of all the ways of measuring brain activity, the electroencephalogram or EEG is one of the most widespread options. It is preferred by specialists because of its great adaptability, high temporal resolution, portability and the range of possibilities derived from its clinical use. Normally, the installation of an EEG system requires a cap that fits over the head and includes integrated sensors for measuring differences in electrical potential. A conductive gel is applied to improve the conductivity between the scalp and the sensor (Ferrez, 2009). All sensors are connected to an amplifier that digitizes the signal and sends it to a computer. However, one of the biggest entry barriers for this technology is the conductive gel that needs to be applied to the head, and current work in this area focuses on eliminating its use (Minguez, 2009).
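As a small example of the first processing step applied to such a digitized EEG recording, the sketch below band-pass filters a synthetic signal to the 8–30 Hz range commonly used in motor-imagery BCIs. The sampling rate, band limits and the synthetic signal itself are assumed values, not taken from the cited works.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 256.0                       # assumed sampling rate of the EEG amplifier (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)  # two seconds of signal
    # Synthetic "EEG": a 10 Hz rhythm buried in broadband noise.
    eeg = np.sin(2 * np.pi * 10 * t) + np.random.normal(0.0, 1.0, t.size)

    # 4th-order Butterworth band-pass between 8 and 30 Hz (normalized by Nyquist).
    b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)   # zero-phase filtering of the recording

    print(filtered[:5])              # first few filtered samples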

There are many applications we can think of for this technology, such as entertainment, education, machinery operation, assistance for the elderly or physically challenged, and so on. One of the first applications gaining ground is video game control by BCI, that is, by means of the users' thoughts. The qualitative leap achieved by the use of BCI in these technologies is enormous. Market studies show that this will be one of the channels through which the technology is introduced first, because video game users are a very large community, very tolerant of new technologies, and spend many hours using their devices, which facilitates the testing stages (Nijholt, 2008).

Much research is also being conducted on what have been called intelligent environments. These embed intelligence in the environment with capabilities for autonomous interaction with the user, with the clear objective of making life easier for people in different fields, for example wearable computing. BCI in this context provides a direct communication channel with the environment for issuing control orders and, in turn, could provide information on the cognitive and emotional status of users, so the environment could make smarter decisions appropriate to each person (Ferrez, 2008).

In 2007, a panel of experts was formed to study the state of BCI technology worldwide, and it highlighted the following research aspects. First, efforts in this line are very significant in the U.S., Europe and Asia, where the amount of research in this area is clearly set to increase. Second, BCI is at, or about to enter, the stage of producing medical devices, but a strong acceleration is expected in non-medical and more commercial environments such as video games, automotive applications and robotics. Third, research efforts are oriented towards invasive technology in the United States, non-invasive technology in Europe, and the synergy between the two types of interfaces and robotics in Japan. In Asia, and particularly China, investment in biological and engineering science programs has increased the investment in BCI and related areas (Berger, 2007). In short, BCI research efforts throughout the world are extensive and clearly on the rise, and although initial work has focused on medical applications, rapid acceleration is expected in commercial arenas, particularly the gaming, automotive, and robotics industries.

Despite technological advancement, the operability of a BCI device outside the laboratory (i.e., in real-life conditions) still remains far from settled. BCI control is indeed characterized by unusual properties compared to more traditional inputs: long delays, noise with varying structure, long-term drifts, event-related noise, and stress effects. Current approaches consist of post hoc processing of the BCI signal so that it better conforms to traditional control (Cincotti, 2009). Since our input and output devices are the major obstacles to effectively using computer tools and technology in general, it can be predicted that, within a moderate time frame (8-10 years), BCI will become a viable alternative to other input methods such as touchscreens, keyboards, and mice.
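The sketch below illustrates one simple form such post hoc processing could take: exponentially smoothing the noisy stream of decoder confidences and emitting a discrete command only after the smoothed value stays confident for several consecutive samples. The smoothing factor, threshold and dwell count are invented for illustration and are not the technique described by Cincotti.

    def smooth_and_gate(probabilities, alpha=0.5, threshold=0.8, dwell=3):
        """Exponentially smooth a stream of 'move' confidences and emit a command
        only after the smoothed value stays above threshold for `dwell` samples."""
        smoothed, above, commands = 0.0, 0, []
        for p in probabilities:
            smoothed = alpha * p + (1 - alpha) * smoothed
            above = above + 1 if smoothed > threshold else 0
            commands.append("move" if above >= dwell else "idle")
        return commands

    # Noisy decoder output: mostly confident, with an occasional dropout.
    stream = [0.9, 0.95, 0.2, 0.9, 0.92, 0.97, 0.99, 0.96, 0.95, 0.97, 0.98, 0.99]
    print(smooth_and_gate(stream))   # "idle" until the signal stays confident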

Citations and References

Berger, T. (2007). International assessment of research and development in brain-computer interfaces. In: WTEC Panel Report.

Cincotti, F. (2010). Interacting with the Environment through Non-invasive Brain-Computer Interfaces. ACM UAHCI '09, Proceedings of the 5th International Conference.

Ferrez, E. (2008). The use of brain-computer interfacing for ambient intelligence. LNCS, Springer Verlag.

Ferrez, P. (2009). Error-related EEG potentials generated during simulated brain-computer interaction. IEEE Transactions on Biomedical Engineering, 55(3), 923–929.

Girouard, A. (2010). Brain, body and bytes: psychophysiological user interaction. ACM CHI EA '10 Proceedings.

McCullagh, P. (2010). Brain Computer Interfaces for Inclusion. ACM AH '10, Proceedings of the 1st Augmented Human International Conference.

Minguez, J. (2009). Brain Computer Interaction Technologies. Journal of the Research Group for Robotics and Real Time Perception, Department of Informatics, Universitat Stuttgart, Vol. 3, No. 23, pp. 20–44.

Nijholt, A. (2008). BCI for games: A 'state of the art' survey. ACM ICEC '08, Proceedings of the 7th International Conference on Entertainment Computing.