Facebook’s COO Sheryl Sandberg has admitted that a week-long psychological experiment involving almost 700,000 unwitting users was “poorly communicated.”

As the Wall Street Journal reports, Sandberg said:

This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you.

The controversial experiment came to public attention through a research paper. Data scientist Adam Kramer led a team of researchers who altered the News Feed algorithm to show 689,000 users a larger proportion of either positive or negative content. The content itself was unedited; it all came from people within each user’s network. The company simply changed which posts appeared first in the News Feed.

The study found that users’ own posts were influenced by whether the content surfaced in their News Feed was positive or negative. Those shown more positive content were, on average, more positive in their own Facebook activity during the experiment; those shown more negative content tended to be more negative.

The study has raised questions about whether Facebook breached ethical guidelines on informed consent.

Kramer also wrote a lengthy post on Facebook apologizing and explaining why the study was conducted without users’ knowledge.

“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out,” he wrote. “At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”

“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he said, a bit more eloquently than Sandberg’s “we never meant to upset you.”

The UK’s Information Commissioner’s Office is now investigating Facebook over its handling of the experiments.