Saturday, June 08, 2013

Autodesk ReCap + Webcam


Trying out the Autodesk ReCap technology... but what if a point cloud comes from a webcam? :-D
Here are Max and I during the Autodesk EMEA Channel Summit 2014 in Milan.
Stay tuned for updates!

Thursday, January 03, 2013

HowTo - Transparency/Alpha in MentalRay MIA materials

Create a mia_material_x_passes
Create a node with your color channel
Create a node with your alpha/transparency channel

Connect the color channel using drag and drop (with the MMB*) and select Diffuse.

Connect the alpha/transparency channel using drag and drop (with the MMB*) and select Other. Maya opens the Connection Editor with two columns: the left column lists the channels of your texture node, the right column lists the channels of your mia_material_x_passes.
If your transparency texture is an image with an alpha channel, select the outAlpha channel on the left, then select the cutoutOpacity channel on the right (it is far down the list, one of the last channels).

Otherwise, if your transparency texture is a black-and-white image stored in the RGB channels, expand the outColor channel on the left, select outR (the red channel), then select the cutoutOpacity channel on the right (again far down the list, one of the last channels).

*MMB = Middle Mouse Button.
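
If you prefer to script these connections, the sketch below shows the same setup with Python (maya.cmds). The node names are placeholders and the attribute names (diffuse, cutout_opacity) are assumptions that may differ between Maya/mental ray versions, so check them with cmds.listAttr() on your own mia_material_x_passes node.

    # Minimal sketch with placeholder node names; requires the mental ray plug-in.
    # The attribute names below are assumptions - verify with cmds.listAttr(mia).
    import maya.cmds as cmds

    mia = cmds.shadingNode('mia_material_x_passes', asShader=True, name='mia_shader')
    color_tex = cmds.shadingNode('file', asTexture=True, name='color_tex')
    alpha_tex = cmds.shadingNode('file', asTexture=True, name='alpha_tex')

    # Color channel -> Diffuse
    cmds.connectAttr(color_tex + '.outColor', mia + '.diffuse', force=True)

    # Image with an alpha channel: outAlpha -> cutout opacity
    cmds.connectAttr(alpha_tex + '.outAlpha', mia + '.cutout_opacity', force=True)

    # Black-and-white RGB image instead: use the red channel
    # cmds.connectAttr(alpha_tex + '.outColorR', mia + '.cutout_opacity', force=True)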




Friday, March 02, 2012

"The Interactive Facade"

As part of ongoing research on the use of genetic algorithms in the built environment, this project investigates the potential of applying them to interactive façade design with multi-objective optimization.
The objective was to develop an adaptive building façade that provides optimal light conditions in the interior and performs as an interactive device for the exterior, reacting to weather conditions and pedestrian activity. The façade acts as a medium to play and communicate with the outside while maintaining privacy for the activities inside the building.
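
This is not the project's code, but just to sketch what a weighted multi-objective fitness for such a façade individual could look like (all the names, weights, and score models below are invented for illustration):

    # Illustrative only: a weighted-sum multi-objective fitness for a facade
    # "individual" (here, a list of panel opening angles in degrees). The two
    # score functions are invented stand-ins, not real daylight/interaction models.
    def daylight_score(angles, target=45.0):
        # pretend interior light is best when panels stay near a target angle
        return 1.0 - sum(abs(a - target) for a in angles) / (90.0 * len(angles))

    def interaction_score(angles, pedestrian_panels=(0, 2)):
        # pretend panels facing pedestrian activity should open wide (90 degrees)
        return sum(angles[i] for i in pedestrian_panels) / (90.0 * len(pedestrian_panels))

    def fitness(angles, w_daylight=0.6, w_interaction=0.4):
        # the GA maximizes this weighted combination of the two objectives
        return w_daylight * daylight_score(angles) + w_interaction * interaction_score(angles)

    print(fitness([40.0, 50.0, 85.0, 30.0]))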



"The Interactive Facade" - The Making Of from Davide Madeddu on Vimeo.

"The Interactive Facade". An interactive facade optimized for daylighting and pedestrian response from Davide Madeddu on Vimeo.

"The Interactive Facade". Pt_2 An interactive facade optimized for daylighting and pedestrian response from Davide Madeddu on Vimeo.

"The Interactive Facade". Pt.3 An interactive facade optimized for daylighting and pedestrian response from Davide Madeddu on Vimeo.



Many thanks to Vectorealism for the great lasercut service!

Wednesday, September 28, 2011

Just for fun! Kinect + Maya


Kinect + Processing + Maya from Davide Madeddu on Vimeo.
A skeleton setup in Maya controlled with Kinect. Processing sends xyz coordinates of the joints to Maya with the UDP protocol. Unfortunately there is a small lag between Processing and Maya, but it works!


A simple HowTo!

**Note: this is a very simple test; you may well find an easier way to do it or faster code.

I used the SimpleOpenNI and UDP libraries in Processing and a Python UDP socket in Maya.
The Processing and Python code is very simple: it uses UDP to stream the xyz coordinates to Maya.

In Processing you can start from scratch or open a basic SimpleOpenNI example and edit it. Basically, I followed these steps:
First of all, you will need to install the UDP library in Processing ;-)
Define the UDP object
Create a PVector to store the joint coordinates
Build the UDP message from the values of PVector.x, PVector.y, PVector.z
Use drawSkeleton() from the SimpleOpenNI library.

You can download the Processing sketch <here>
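
The real sender is the Processing sketch linked above (SimpleOpenNI plus the UDP library). As a stand-in that only illustrates the kind of "joint,x,y,z" message format assumed here, this little Python script streams fake joint coordinates the same way, so the Maya-side sketch further below can be tested without a Kinect (the IP, port, and message layout are assumptions):

    # Python stand-in for the Processing sender: streams fake "joint,x,y,z"
    # datagrams. The IP, port, and message format are assumptions.
    import math
    import socket
    import time

    UDP_IP = "127.0.0.1"
    UDP_PORT = 6000
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    t = 0.0
    while t < 10.0:                                  # stream for about ten seconds
        x, y, z = 10 * math.sin(t), 10 * math.cos(t), 0.0
        message = "head,%f,%f,%f" % (x, y, z)        # one joint per datagram
        sock.sendto(message.encode(), (UDP_IP, UDP_PORT))
        time.sleep(1.0 / 30)                         # roughly the Kinect frame rate
        t += 1.0 / 30
    sock.close()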

In Maya it was very simple too!
I scripted it in Python (I am learning it!) and used the UDP protocol to receive the coordinates from Processing. You can find all the UDP commands by searching for "UDP Python" on the Internet, and the same syntax applies in Maya.
First I created a series of locators (you can also script this step) named after the bones, then I created the skeleton and constrained the joints to the locators.
On the script side I imported socket and defined the UDP IP, port, etc. (they must be the same as in Processing). I used the simple syntax found in the UDP help reference, split the incoming message, and assigned the xyz values to the locators with setAttr.
A boolean stops the streaming when the "q" key is pressed.

You can download the Python script for Maya <here>
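
The real script is the one linked above; the following is only a minimal sketch of the Maya-side logic as described (socket, split, setAttr), with a hypothetical locator naming scheme and port, and with a fixed frame count in place of the original "q"-key boolean:

    # Minimal sketch of the Maya-side receiver (hypothetical names and port).
    import socket
    import maya.cmds as cmds

    UDP_IP = "127.0.0.1"    # must match the Processing sketch
    UDP_PORT = 6000         # must match the Processing sketch

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((UDP_IP, UDP_PORT))

    for _ in range(300):    # the original loops until a boolean is cleared by "q"
        data, _addr = sock.recvfrom(1024)              # e.g. "head,12.3,45.6,78.9"
        name, x, y, z = data.decode().split(",")
        locator = name + "_loc"                        # hypothetical locator naming
        if cmds.objExists(locator):
            cmds.setAttr(locator + ".translate",
                         float(x), float(y), float(z), type="double3")
    sock.close()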

Have fun and feel free to contact me if you want! Bye


Wednesday, September 07, 2011

Kinect + Grasshopper + Arduino (updates)

A simple responsive model is driven by people's gestures captured with the Kinect. In this update, using Arduino, I controlled a very simple prototype with LEDs that reflects the geometry changes in the digital model. More updates soon...


Kinect + Arduino from Davide Madeddu on Vimeo.


kinect + arduino :: screen capture from Davide Madeddu on Vimeo.

Wednesday, August 17, 2011

Kinect + Grasshopper

More updates soon..stay tuned!

Wednesday, August 10, 2011

Thanks to Rafael Roa for this new Smartgeometry2011_IwC video!

Rafael Roa (Lem3A, http://www.lem3a.es) has published this new video, which describes in depth the work done in the Interacting with Cities cluster at Smartgeometry 2011.
Video credits: Rafael Roa - http://www.vimeo.com/26565217 or http://www.lem3a.es

Saturday, April 16, 2011

Back from Smartgeometry 2011 Copenhagen

From 28th March to 2nd April I was in Copenhagen attending the Smartgeometry Workshop and Conference. This year the venue was the Royal Academy of Fine Arts School of Architecture, hosted by the Centre for IT and Architecture (CITA) in Copenhagen.
I participated in the Interacting with the City (IwC) cluster, led by Przemek Jaworski, Flora Salim, and Martin Kaftan. The cluster worked on a series of tangible table prototypes using Kinect and online real-time data sources such as Google Maps, Twitter, and weather data.

There were different setups in which the user could interact in real time with the digital model through hand gestures (single and multi-touch) or physical objects on the table. The movements and gestures are captured with cameras or with the Kinect controller, which can also scan the entire scene as depth data and use the resulting point cloud to reconstruct the 3D geometry.
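
As a rough illustration of the depth-to-point-cloud step (this is not the cluster's code, and the camera intrinsics below are assumed typical Kinect values):

    # Rough sketch: back-projecting a Kinect depth frame into a 3D point cloud
    # with the pinhole-camera model. fx, fy, cx, cy are assumed intrinsics and
    # the depth frame is a 480 x 640 grid of millimetre values.
    def depth_to_points(depth, fx=594.0, fy=591.0, cx=320.0, cy=240.0):
        points = []
        for v, row in enumerate(depth):
            for u, z in enumerate(row):
                if z == 0:                 # 0 means no reading from the sensor
                    continue
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, float(z)))
        return points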

My group was interested in environmentally and climatically responsive design, and we developed a project called “Hands on Ofelia’s Beach” (Ofelia Beach is the Royal Danish Theatre's outdoor stage and lounge area on Kvæsthusmolen in central Copenhagen).
The project explores interaction with the digital model through tangible objects used to experiment with form and shape. Architects and designers can use physical building blocks, made of foam or paper, to explore the effect of the design on the site with regard to sun and wind conditions. Interactively, the designer can search for the configuration of blocks that performs best in terms of local wind conditions and exposure to solar radiation.

We used the Kinect as a 3D geometry scanner to capture the objects on the table, together with real-time data from the Copenhagen Airport weather station. The 3D point cloud from the Kinect, plus the wind direction and speed, is used in Processing to calculate a 2D flow simulation around the objects with a Navier-Stokes algorithm (based on code by Martin Kaftan).
The physical model captured by the Kinect in the Processing environment is continually streamed to the digital model in Bentley GenerativeComponents (GC), making it possible to record the different design stages and continue the design in a digital environment. The connection is made by UbiMash (developed by Flora Salim), which exports the point cloud and converts it into 3D geometry. Inside GC it was possible to connect the model with Ecotect using the gc-ecotect-link to get the incident solar radiation and daylight levels of the objects on the table.

The connection between the software packages was pretty fast (excluding the computation time, which depends on accuracy and mesh complexity) and lets the analysis values be read directly inside GC, where they can be used as input to new features such as shading devices or openings. The next step was to improve the design by integrating user-defined features to develop it further. Any change in the physical model is computed and updates the digital one, triggering a new analysis that is visualized back on the table, giving an interactive design feedback loop.





SmartGeometry 2011 Copenhagen
Interacting with City Cluster (Przemek Jaworski, TU Wroclaw – Flora Salim, SIAL, RMIT – Martin Kaftan, Echorost)
Group 1 “Hands on Ofelia’s Beach” (Davide Madeddu, DiArch – Raul Kalvo, Estonian Academy of Arts – Jakob Bak, Danish Architecture Centre – Tore Banke, CITA)

Monday, March 07, 2011

My first attempts at genetic algorithms

It has been a long time since the last post! So... I'll post here my first attempt at genetic algorithms from a few months ago. I tried to write a GA script, with functions to build the population, the genes, the fitness, and so on.
My script is based on Ben Doherty's helpful script, which I adapted for my own purposes.
This is a first test and it is very simple: there is a geometry made of polygons (like a mesh) whose vertices can change their Z coordinate to get the best exposure of the faces to the sun (represented by a point). The algorithm checks the dot product between each face normal and the direction of the sun until it approaches the best value of 1.0.
Yes, I know, you can't find the gct file here, and there are no images yet... please be patient! I will post some updates: the script is still a work in progress and isn't as rock-solid as it seems!
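
As a plain-Python sketch of that fitness idea (the GC script itself is not posted; vectors here are simple (x, y, z) tuples):

    # Sketch of the fitness described above: the fitness of a face is the dot
    # product of its unit normal with the unit vector pointing toward the sun,
    # so 1.0 means the face points straight at the sun.
    def normalize(v):
        length = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
        return (v[0] / length, v[1] / length, v[2] / length)

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def fitness(face_normals, sun_direction):
        sun = normalize(sun_direction)
        return sum(dot(normalize(n), sun) for n in face_normals) / len(face_normals)

    # Example: two faces, sun straight overhead.
    print(fitness([(0.0, 0.0, 1.0), (0.0, 1.0, 1.0)], (0.0, 0.0, 1.0)))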

Thursday, November 18, 2010

:-) Arduino

A few weeks ago I ordered an Arduino Duemilanove from Fritzing with a nice starter kit.
Now I am ready to start my little experiments with Arduino! I hope to post here very soon...
A few days ago the Uno <here>, the latest Arduino version, was released... but there are still some pros for the Duemilanove, as reported here <LadyAda FAQ>.

Wednesday, May 05, 2010

Cellular Automata in GenerativeComponents

I have done a simple test in GenerativeComponents..

Rule 30
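
The GC file isn't posted, but for reference this is what Rule 30 looks like as a minimal Python sketch; the GC version maps each generation to geometry instead of printing characters:

    # Minimal Rule 30 sketch: each cell's next state is looked up in the rule
    # number using its (left, centre, right) neighbourhood as a 3-bit index.
    RULE = 30
    width, generations = 31, 15
    cells = [0] * width
    cells[width // 2] = 1                       # start with a single live cell

    for _ in range(generations):
        print("".join("#" if c else "." for c in cells))
        cells = [(RULE >> (cells[(i - 1) % width] * 4 +    # wrap-around edges
                           cells[i] * 2 +
                           cells[(i + 1) % width])) & 1
                 for i in range(width)]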

Wednesday, March 31, 2010

Back from SmartGeometry 2010 @ IaaC Barcelona

My photos from SmartGeometry 2010 @ IaaC Barcelona


I was there!!!
My workshop cluster was Deep Surfaces; more info here

Article from Frame And Form <eng>

From SmartGeometry website
The SmartGeometry Group is pleased to announce that the annual SG event will take place in Barcelona 19th – 24th March 2010. The event will be the key event in the year for discussing, learning and networking about the emerging practice of digital parametrics within the AEC community.

The event will come in three parts, a Workshop (19th-22nd March), a public Shop Talk (23rd March), and a public Symposium and Reception (24th March). These events follow the highly successful previous SG events in San Francisco 2009, Munich 2008, New York 2007, Cambridge/London, UK 2006 and multiple preceding events.
Annually, the SG workshop attracts an impressive group of tutors and attendees from across the world of academia, professional practice as well as many of the brightest students. The workshop is open to 100 applicants who come together for four intensive days of design and collaboration. Admission to the workshop is based on selection by the SmartGeometry Group.
The 2010 SmartGeometry Workshop is organised around the challenge ‘Working prototypes’. Working prototypes are functioning prototypes developed for the purpose of proving and testing a concept and design. In 2010 the workshop will be held in the great industrial space at the IAAC in Barcelona. This allows possibilities for fabrication on even a 1:1 scale, of creating material stress, physical thinking and mental gymnastics. This puts the physical at centre stage in challenging participants to design, assemble and test working prototypes.

<link1> <link2>

Saturday, February 20, 2010

Elegant Ecotones GH Workshop


Grasshopper workshop Level I - parametric & generative design
Co-de-iT _ Bologna _ 15-19 March 2010

Sunday, January 31, 2010