
ML trip to New York

Washington Square Park

Hi readers, I am now in New York, a city that never sleeps and one where many great minds in science reside (many great machine learners live here as well).

It is good to be here. I will take this opportunity to interact with people working in different fields, such as astrophysics, particle physics, and computer vision, and hopefully learn something new.

The primary goal of this trip is to visit my advisor, Prof. Bernhard Schölkopf, who is visiting NYU for three months, and to finish our NIPS paper. Another goal is to continue the collaboration with David Hogg and Jo Bovy on quasar target selection and to see whether we can extend it in another direction.

The week started on Monday, when Dustin Lang from CMU also visited David and Bernhard for three days to work on image denoising. I am very impressed by how much work they got done in three days. Bernhard also told me about the idea of inferring the CCD sensitivity from image patches, which I find very interesting. Dustin also took us to Etsy, the company where one of his friends works; it runs a website for selling handmade goods. We had a quick tour of the office, which is quite relaxed.

While everyone was busy, I tried my best to finish the first draft of our NIPS paper. It is now in its final shape.

Think Coffee

On Friday, we hung out with Will Freeman from MIT. I met Will at the Astroimaging workshop in Switzerland. We spent the whole morning together with Bernhard, David, Rob, Ross, and others, discussing all sorts of things. Will then gave a talk in the afternoon about his work on image/motion amplification. It is very cool stuff.

I am now looking forward to another exciting week.

ICML 2013

It is very exciting to see many interesting papers at ICML this year (see http://icml.cc/2013/?page_id=43 for the list of accepted papers). It is also good to see that several papers are co-authored by AGBS members.

This year, I have been involved in two ICML papers, both of which are in the area of kernel methods and transfer learning. The first paper is

Domain Generalization via Invariant Feature Representation
K. Muandet (MPI-IS), D. Balduzzi (ETH Zurich), and B. Schoelkopf (MPI-IS)

As opposed to domain adaptation, where one usually assumes that data from the target domain are available during training, domain generalization drops that assumption: information is collected from several source domains during training, and the model must then generalize to an unseen target domain at test time. The paper is already available online (see the link above).
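To make the setting a bit more concrete, here is a minimal linear sketch of the invariant-feature idea. It is not the algorithm from the paper, just an illustration of projecting data onto directions that preserve overall variance while suppressing variation across domain means; all names and data are hypothetical.

    import numpy as np
    from scipy.linalg import eigh

    def invariant_projection(domains, n_components=2, eps=1e-3):
        """domains: list of (n_i, d) arrays, one array of samples per source domain."""
        X = np.vstack(domains)
        X = X - X.mean(axis=0)                       # centre the pooled data
        total_cov = X.T @ X / len(X)                 # variance we would like to keep
        means = np.stack([D.mean(axis=0) for D in domains])
        means = means - means.mean(axis=0)
        domain_cov = means.T @ means / len(domains)  # variation across domain means (want small)
        # generalized eigenproblem: maximise kept variance relative to domain variation
        vals, vecs = eigh(total_cov, domain_cov + eps * np.eye(X.shape[1]))
        order = np.argsort(vals)[::-1]
        return vecs[:, order[:n_components]]         # d x n_components projection matrix

    # Usage (hypothetical): W = invariant_projection([X_dom1, X_dom2, X_dom3]); train any
    # classifier on X @ W, and apply the same projection to data from the unseen target domain.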

The second paper is

Domain Adaptation under Target and Conditional Shift
K. Zhang (MPI-IS), B. Schoelkopf (MPI-IS), K. Muandet (MPI-IS), and Z. Wang (MPI-IS)

The work investigates the domain adaptation problem when the conditional distribution also changes, as opposed to previous settings where only the marginal distribution is allowed to change. We make use of knowledge from causality to solve this problem. The paper will be available soon.
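For intuition on the simplest case, target shift (the class proportions change between source and target), the classical fix is to reweight the training samples by the ratio of class priors. The sketch below is not the kernel-embedding method from the paper; it simply assumes the target class proportions are already known or estimated, and the variable names are made up for illustration.

    import numpy as np

    def target_shift_weights(y_source, target_priors):
        """Per-sample importance weights w(y) = P_target(y) / P_source(y).

        y_source: integer class labels of the source/training data.
        target_priors: dict mapping class label -> assumed target class proportion.
        """
        classes, counts = np.unique(y_source, return_counts=True)
        source_priors = dict(zip(classes, counts / len(y_source)))
        return np.array([target_priors[y] / source_priors[y] for y in y_source])

    # Usage (hypothetical): weights = target_shift_weights(y_train, {0: 0.7, 1: 0.3}),
    # then pass them as sample weights to any classifier that supports them, e.g.
    # LogisticRegression().fit(X_train, y_train, sample_weight=weights).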