Ardern and Macron will later issue the Christchurch Call to fight the spread of hateful and terror-related content along with leaders from Britain, Canada, Norway, Jordan and Senegal, who will also be in Paris.
Now Ardern is turning her efforts toward another factor in the violence that day: the social media platforms on which the gunman live-streamed his attack.
Ardern and Macron will lead a meeting in Paris today that seeks to have world leaders and technology company chiefs sign a pledge to eliminate violent content online.
Representatives from Facebook, Alphabet Inc's Google, Twitter Inc and other tech companies are expected to attend, although Facebook chief executive Mark Zuckerberg will not. In the 24 hours after the attack, Facebook scrambled to remove 1.5 million videos containing footage of the bloodshed.
"Today's announcement addresses a key component of the Christchurch Call, a shared commitment to making live streaming safer". Therefore, if a user posted content leading to a terrorist website, they'd be banned from livestreaming.
In an email to Reuters, Ardern called Facebook's new limits on live streaming "a good first step to restrict the application being used as a tool for terrorists".
"One of the challenges we faced in the days after the attack was a proliferation of many different variants of the video of the attack", vice-president of integrity Rosen said.
The company said it plans to extend the restrictions in the coming weeks, including to the creation of ads, but did not lay out specific plans. It also pledged $7.5 million toward research with universities such as Cornell and UC Berkeley on "image and video analysis technology". There is also no clarity on what happens after the "set period", when the user regains the ability to post live videos.
Under the new policy, the alleged Christchurch shooter would not have been able to livestream the massacre from his account in March, a Facebook spokesperson told CNN Business.
"There's no word on what the rules are, so that makes it very hard to determine whether or not posts and video content are in breach of the rules".
Ardern said the research was welcome, and that edited and manipulated versions of the video of the March 15 mosque shootings had been slow to be removed, resulting in many people, including herself, seeing them play in Facebook feeds.